Analyst4j - Feature List
The Eclipse Java Development Tools provide a productive Java development IDE; their plugin model allows software vendors to concentrate on the value they can add rather than building a system from scratch. Analyst4j is an Eclipse plugin for finding and analyzing Java code using software metrics.
Effective management of any process requires quantification, measurement, and modeling. Software metrics provide a quantitative basis for the development and validation of models of the software development process. Analyst4j automates the measurement process and provides an environment to find, analyze and visualize Java code quality, patterns and threats.
Applications developed using object-oriented languages like Java are conceived as a collection of objects which interact with one another to achieve the desired functionality. Interaction between objects creates relationships between them. Object-oriented metrics, proposed by Chidamber & Kemerer and others, provide methods to measure the quality of object design, their relationships, dependencies, and other OO principles.
Analyst4j measures these metrics automatically, which forms the source for analysis.
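As an illustration of one metric from the Chidamber & Kemerer suite mentioned above, Weighted Methods per Class (WMC) is commonly defined as the sum of the complexity weights of a class's methods. The sketch below is a minimal hand-rolled computation; the method names and their complexity values are made-up inputs, not output of Analyst4j:

```java
import java.util.Map;

public class WmcExample {
    // WMC = sum of the per-method complexity weights of one class.
    // The weights would normally come from a static analyzer; here
    // they are supplied directly as a map for illustration.
    static int wmc(Map<String, Integer> methodComplexities) {
        return methodComplexities.values().stream()
                .mapToInt(Integer::intValue)
                .sum();
    }

    public static void main(String[] args) {
        // Hypothetical class with three methods of complexity 1, 4 and 2
        Map<String, Integer> m = Map.of("open", 1, "read", 4, "close", 2);
        System.out.println(wmc(m)); // prints 7
    }
}
```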
Complexity of a system or its components is defined as the degree to which the system or component has a design or implementation that is difficult to understand and verify. In general, program complexity can be classified into three categories: logical complexity, psychological complexity, and structural complexity.
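One widely used logical-complexity measure is McCabe's cyclomatic complexity: the number of independent decision points in a method plus one. The annotated method below is invented purely to show how the count works; it is not taken from Analyst4j:

```java
public class CcExample {
    // Three decision points (one loop, two if-branches) plus the base
    // path give this method a cyclomatic complexity of 4.
    static int clamp(int[] values, int max) {
        int changed = 0;
        for (int i = 0; i < values.length; i++) { // decision point 1
            if (values[i] > max) {                // decision point 2
                values[i] = max;
                changed++;
            } else if (values[i] < 0) {           // decision point 3
                values[i] = 0;
                changed++;
            }
        }
        return changed; // base path => CC = 3 + 1 = 4
    }

    public static void main(String[] args) {
        int[] v = {5, -1, 2};
        System.out.println(clamp(v, 3)); // prints 2 (two values clamped)
    }
}
```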
Quantitative measurement of an operational system's maintainability is desirable both as an instantaneous measure and as a predictor of maintainability over time. Efforts to measure and track maintainability are intended to help reduce a system's tendency toward "code entropy" or degraded integrity, and to indicate when it becomes cheaper and/or less risky to rewrite the code than to change it.
Software Maintainability Metrics Models in Practice is the latest report from an ongoing, multi-year joint effort (involving the Software Engineering Test Laboratory of the University of Idaho, the Idaho National Engineering Laboratory, Hewlett-Packard, and other companies) to quantify maintainability via a Maintainability Index (MI) [Welker 95].
Measurement and use of the MI is a process technology, facilitated by simple tools, that in implementation becomes part of the overall development or maintenance process. These efforts also indicate that MI measurement applied during software development can help reduce lifecycle costs.
Analyst4j automates the MI measurement process.
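The four-metric MI from the cited line of work is commonly stated as MI = 171 − 5.2 ln(aveV) − 0.23 aveCC − 16.2 ln(aveLOC) + 50 sin(√(2.4 perCM)). The sketch below assumes perCM is supplied as a fraction between 0 and 1 (published formulations differ on this point); the constants come from the published model, not from Analyst4j internals:

```java
public class MiExample {
    // aveV:   average Halstead Volume per module
    // aveCC:  average extended cyclomatic complexity per module
    // aveLoc: average lines of code per module
    // perCM:  average comment ratio, assumed here as a 0..1 fraction
    static double maintainabilityIndex(double aveV, double aveCC,
                                       double aveLoc, double perCM) {
        return 171.0
             - 5.2  * Math.log(aveV)
             - 0.23 * aveCC
             - 16.2 * Math.log(aveLoc)
             + 50.0 * Math.sin(Math.sqrt(2.4 * perCM));
    }

    public static void main(String[] args) {
        // Sample averages, chosen only for illustration
        System.out.println(maintainabilityIndex(100, 5, 50, 0.2)); // ~114.5
    }
}
```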
While well-researched specialized metrics like the above provide specific insights about code quality, general code metrics are useful when used in combination with one another or with specialized metrics such as OO metrics or complexity metrics. For example, comparing the percentage of comments in a method with its Cyclomatic Complexity reveals undocumented complex methods, which are clear hurdles to maintainability.
Analyst4j automates the following code metrics across four levels (Method, Class, File, and Package).
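The metric combination described above can be sketched as a simple filter over per-method measurements. The record type, the input values, and the thresholds are all assumptions for illustration, not part of the Analyst4j API:

```java
import java.util.List;
import java.util.stream.Collectors;

public class MetricFilter {
    // Minimal per-method measurement record (illustrative shape)
    record MethodMetrics(String name, int cyclomaticComplexity,
                         double commentPercent) {}

    // Flag methods that are complex but sparsely commented:
    // the "undocumented complex methods" combination from the text.
    static List<String> undocumentedComplex(List<MethodMetrics> methods,
                                            int ccThreshold,
                                            double minCommentPercent) {
        return methods.stream()
                .filter(m -> m.cyclomaticComplexity() > ccThreshold
                          && m.commentPercent() < minCommentPercent)
                .map(MethodMetrics::name)
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<MethodMetrics> sample = List.of(
                new MethodMetrics("parse", 12, 2.0),
                new MethodMetrics("toString", 1, 0.0),
                new MethodMetrics("eval", 15, 30.0));
        // Only "parse" is both complex (>10) and undercommented (<10%)
        System.out.println(undocumentedComplex(sample, 10, 10.0));
    }
}
```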
Search is an almost inevitable activity when it comes to understanding software. IDEs, tool vendors, and even search-engine giants provide code-related search to cater to this need. While current search features provide first-level information using text scans and cross-references, specialized search using code quality attributes is mostly ignored.
Analyst4j provides a unique metrics-based search facility which enables users to find code that is otherwise difficult or impossible to identify. Even for someone well versed in a particular technology or programming language, focusing on the artifacts of importance is key to prioritizing maintenance and development activities. Metrics-based search helps find these artifacts of importance.
Using the Analyst4j search facility, you can find code based on its quality attributes, which are computed automatically by Analyst4j's powerful static analyzer.
The search results can be copied to the clipboard to aid your documentation and reporting, and the resources can be tagged in the Eclipse "Tasks" view as a kind of TODO / work log.
Analyst4j provides more than 20 software metrics spanning four levels, with more under evaluation for the versions to come. It also comes with built-in, modifiable search queries such as "Blob Classes / Methods", "Spaghetti Code", "Swiss Knife Classes", "Complex Undocumented Methods", "Business Entity Classes", etc.
Automated metrics save manual counting effort and provide consistent results, but they have a downside: they produce too much data too quickly, leaving the user to mine through numbers. Graphs based on these data solve this problem and provide an abstract representation of the data collected.
Analyst4j provides a rich charting feature set to visualize the automated metrics.
The quality analysis perspective of Analyst4j provides graphs for Comparison Analysis, Distribution Analysis, and search-based Custom Analysis.
Comparing metrics helps us visualize patterns and trends in the data and, subsequently, in the code which exhibits these qualities.
For example, comparing Weighted Method Complexity (WMC2) with the inheritance depth of a class (DIT_CLS) may show that complexity decreases as inheritance depth increases, i.e., the complexity of classes is well distributed. In a system with very little use of inheritance, the complexity of classes is concentrated, leading to more blob classes and Swiss-knife classes. It is also easy to find extreme cases or deviating elements.
Visualizing distribution helps to categorize elements based on value range. Analyst4j provides a customizable categorization interface to quickly evaluate the distribution.
For example, the distribution of the SEI Maintainability Index metric can be categorized into "Very Poor Maintainability", "Poor Maintainability", "Good Maintainability", and "Excellent Maintainability" based on value ranges. Categorization helps address issues specifically, on a reduced set of similar elements.
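Such a categorization can be sketched as a simple banding function over MI values. The cut-off values below are illustrative assumptions, not Analyst4j defaults:

```java
public class MiBands {
    // Map an MI value to one of the four bands named in the text.
    // The thresholds (40 / 65 / 85) are invented for this sketch and
    // would normally be tuned per project.
    static String category(double mi) {
        if (mi < 40) return "Very Poor Maintainability";
        if (mi < 65) return "Poor Maintainability";
        if (mi < 85) return "Good Maintainability";
        return "Excellent Maintainability";
    }

    public static void main(String[] args) {
        System.out.println(category(30)); // prints Very Poor Maintainability
        System.out.println(category(90)); // prints Excellent Maintainability
    }
}
```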
Create custom analyses based on search queries, and visualize them using graphs to interpret the impact of the values.
While metrics-based searches provide quick access to elements of interest, saving these queries lets them serve as input for custom analyses. For example, to understand antipatterns in methods and classes, one can create or edit metric-based search queries and include them in a custom analysis to visualize the query results as graphs.
Analyst4j comes with some default custom analyses for understanding Method Antipatterns, Class Antipatterns, and Object-Oriented Design Validation. Users can edit these analyses and their inputs (search queries). However, the framework encourages users to create their own quality analyses.
Comparison, Distribution, and Custom Analysis are visualized using graphs and charts. While the graphs are abstract representations of data (metrics), we often need to trace back to the resource whose data is being displayed. Analyst4j provides a feature to list the resources behind graph elements (categories or individuals) and assists further in locating the resource in the project. The QA Cases View provides an interface which is sensitive to selections in the graphs.
Preparing a quick report? Use the copy-to-clipboard feature of the QA Cases view to copy the contents listed in the table. Copied as tab-separated values, the data pastes perfectly into spreadsheets such as Microsoft Excel and into word processors.
Analyst4j provides a feature to convert analysis results into a workable task list, thereby translating your metrics-based discoveries into improvements of the identified areas. For example, if you have just identified a list of undocumented spaghetti code using Essential Complexity and Percentage of Comments, you can tag the resources as tasks in the Eclipse "Tasks" view with a comment so that you can work on them.
One of the needs of quality analysis is preparing reports on the issues being analyzed. Maintainability and testability are issues often analyzed using software metrics. Analyst4j helps with your reporting needs in the following ways.
Copy to clipboard (discussed earlier).
Save a graph as an image, which can complement your presentations.
Export the analysis as a PDF report when there is a need for portable, authoritative communication of your analysis.
Export the analysis as a Word file (RTF) when there is a need to edit and write your own comments and interpretations of an analysis.
Thresholds, or acceptable value ranges, for software metrics are often debated, but that is no reason to opt out. Application- or domain-specific thresholds can be used to evaluate code quality at a first level; however, finding the right threshold and justification for a metric requires deeper analysis of the values specific to the project.
Analyst4j provides a feature to understand an object's measurement compared to its peers.
Comparing a class's complexity against its peers in the project gives a justification for its value and state. Also, to make a judgment about a class, one has to understand the complexity of its methods (composition).
Comparing the metrics of a package against all the packages of the project gives its relative rank, which can be used to assign importance to that package.
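The peer comparison described above amounts to a percentile rank: how an element's metric value sits relative to the same metric across all its peers. A minimal sketch, where the peer values are supplied directly rather than gathered by the tool:

```java
import java.util.Arrays;

public class PeerRank {
    // Fraction of peers (0..1) whose metric value is strictly below
    // the given value; a crude percentile rank for peer comparison.
    static double percentile(double value, double[] peerValues) {
        long below = Arrays.stream(peerValues)
                           .filter(v -> v < value)
                           .count();
        return (double) below / peerValues.length;
    }

    public static void main(String[] args) {
        // A package with complexity 5 ranks above 3 of its 5 peers
        double[] peers = {1, 2, 3, 5, 9};
        System.out.println(percentile(5.0, peers)); // prints 0.6
    }
}
```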
Apart from its other uses, the "Analyst4j - Metrics View" can be used to calibrate the metric categorization in "Distribution Analysis": since the Metrics View gives the minimum, maximum, and average value of a metric in the project, one can adjust the categorization to fit the project. This method has to be exercised with caution, as the categorization could become too project-specific and lose its generality.
The Analyst4j - Metrics View is synchronized with the "Package Explorer View" and the "QA Scope View", and at all times displays the information of the currently selected object. The information and metrics vary depending on the type of object (Project/Package/File/Class/Method).