By default, OverOps examines and shows only your code. However, there may be times when you want to see 3rd party methods in your error analyses or detect errors in 3rd party libraries. For example, if you’re using OverOps with Spark, you’ll likely want to include the Spark libraries so you can see errors happening in your Spark clusters. This section details how to include these methods and libraries in OverOps.
By default, error analyses in OverOps show only your code. However, if you want a full stack trace that includes 3rd party methods, OverOps lets you include them in the call stack of your error analyses. To see the 3rd party methods involved in an error you’re analyzing, click the slider at the bottom of the stack in the error analysis screen.
The 3rd party methods filter
Turning the slider on will include 3rd party methods in your call stack and error analysis. In the call stack, you will see the location of any 3rd party methods involved.
A view of 3rd party methods in the call stack.
Clicking on the 3rd party icon in the call stack will expand it, showing you exactly which 3rd party methods make up that segment of the stack.
An expanded view of 3rd party methods in the call stack.
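To make the collapse-and-expand behavior concrete, here is a minimal sketch of the kind of grouping shown in the call stack: consecutive 3rd party frames are folded into a single expandable segment, while frames from your own code stay as individual entries. This is illustrative only, not OverOps code; the `com.yourcompany` prefix is a hypothetical company package.

```python
# Illustrative sketch only -- not OverOps code. Shows how consecutive
# 3rd party frames in a call stack can be collapsed into one segment,
# similar to the collapsed 3rd party icon in the OverOps call stack.

MY_PACKAGES = ("com.yourcompany",)  # hypothetical company package prefix

def is_third_party(frame):
    """A frame is 3rd party if it is not under one of your own packages."""
    return not frame.startswith(MY_PACKAGES)

def collapse(stack):
    """Group consecutive 3rd party frames into a single expandable segment."""
    result = []
    for frame in stack:
        if is_third_party(frame):
            if result and isinstance(result[-1], list):
                result[-1].append(frame)   # extend the current 3rd party segment
            else:
                result.append([frame])     # start a new 3rd party segment
        else:
            result.append(frame)           # your code stays a single entry
    return result

stack = [
    "com.yourcompany.api.Handler.handle",
    "org.apache.spark.rdd.RDD.collect",
    "org.apache.spark.SparkContext.runJob",
    "com.yourcompany.jobs.Report.run",
]
print(collapse(stack))
```

Expanding the 3rd party icon in the UI corresponds to opening one of the grouped segments to see the individual frames inside it.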
OverOps does not collect code and variable information for 3rd party methods by default. However, you can tell OverOps to include the code and variables for particular 3rd party packages in the Advanced Settings. In the error analysis screen, 3rd party methods appear like this unless you choose to include them:
A 3rd party method
If seeing where 3rd party methods were called in an error’s call stack is not enough information, you can have OverOps fully monitor specific 3rd party libraries. OverOps is designed to find errors within your code, so it automatically ignores a base list of known packages (e.g. java.lang, scala.lang, org.eclipse, org.apache, etc.). If you wish to override this base list and monitor these packages, or any others, you have the option to do so. Please note that overriding the base list means you will have to manually tell OverOps to include your own code packages as well. To override the base list and move to an include/exclude model, follow these instructions:
1. In your OverOps dashboard, click “Add Server” in the top bar.
2. Click the “Advanced Settings” button.
3. Click the “+” mark located next to your secret key:
The Advanced Settings menu
4. Enter your code packages and the 3rd party packages you want to monitor in the “Only show the following packages” box.
5. Press the “Close” button to save your changes.
Likewise, if OverOps is monitoring packages that you don’t want or need it to, you can tell OverOps to exclude those packages. To do so, follow steps 1-3 above and enter the package names in the “Exclude the following packages” box. Press the “Close” button to save these changes.
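The include/exclude model described above can be sketched in a few lines. This is an illustration of the model, not OverOps code, and the assumption that an exclude entry takes precedence over an include entry is mine; the package names are placeholders.

```python
# Illustrative sketch only -- not OverOps code. Models the behavior of the
# "Only show the following packages" and "Exclude the following packages"
# boxes in Advanced Settings.

def parse_packages(box_value):
    """Split a semicolon-separated package list as entered in a settings box."""
    return [p.strip() for p in box_value.split(";") if p.strip()]

def is_monitored(class_name, only_show, exclude):
    """A class is monitored if it matches an include prefix and no exclude prefix.

    Assumption: excludes take precedence over includes.
    """
    included = any(class_name.startswith(p) for p in only_show)
    excluded = any(class_name.startswith(p) for p in exclude)
    return included and not excluded

only_show = parse_packages("com.yourcompany; org.apache.spark;")
exclude = parse_packages("org.apache.spark.ui;")

print(is_monitored("com.yourcompany.api.Handler", only_show, exclude))  # your code
print(is_monitored("org.apache.spark.rdd.RDD", only_show, exclude))     # included 3rd party
print(is_monitored("org.apache.spark.ui.WebUI", only_show, exclude))    # excluded
```

Note how the include list must carry your own packages (`com.yourcompany` here): once you override the base list, nothing is monitored unless it matches an include entry.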
Once you include a 3rd party library, OverOps treats it as if it were your code. That means it will not be called out as 3rd party code in your dashboard or error analyses.
This has been fairly abstract so far, so let’s go over an example of when you would want to include a 3rd party library in OverOps. Say you want to use OverOps on your Spark cluster. To see errors coming from Spark, you will need to include the Spark library in the Advanced Settings. Here’s what you would do:
1. Follow the steps here (or above) to navigate to the Advanced Settings screen.
2. Click the “+” mark located next to your secret key.
3. In the “Only show the following packages” box, enter all your personal/company code packages separated by semicolons (e.g. com.yourcompany; org.yourcompany; net.yourcompany;), then add the package for Spark (org.apache.spark).
4. Click the “Close” button to save your changes. Make sure you’ve included all your personal/company packages in the box first! Otherwise OverOps will not monitor them.
OverOps will now treat Spark as part of your code environment and monitor errors that occur there. Note that Spark is a Scala library, so if you want to see your errors on the original Spark code, you’ll have to attach the source code. If you have questions about attaching source code for a 3rd party library, please contact us.
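Putting the Spark example together, the “Only show the following packages” box would contain something like the following (the company package names are the placeholders from the example above; substitute your own):

```
com.yourcompany; org.yourcompany; net.yourcompany; org.apache.spark
```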