An overview of how we test, rate, and review different software platforms
Software Connect is a software matching service that helps businesses find the right solution for their needs. To provide the best matches, we personally review and score software so you know what’s best and can avoid the rest.
Software Connect reviews software based on the following categories:
We may add or omit review categories based on the evaluated software type.
Each category is weighted based on the product type and relevant standards.
These categories are scored out of 10, with 10 being the highest and 1 the lowest. A product has to be exceptional to receive a perfect score, so any score above a 7 indicates good standing in that category.
These category scores are averaged, and the result is then weighted against other factors. For this reason, some incredibly robust, enterprise-level software might have a lower overall score than a simple, forever-free program intended for startups. Different markets, different considerations, different scores.
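The weighted-average idea above can be sketched in a few lines of code. The category names and weights below are hypothetical examples chosen for illustration, not Software Connect's actual categories or weighting:

```python
# Illustrative sketch of a weighted average over 1-10 category scores.
# Categories and weights are hypothetical, not the actual rubric.

def overall_score(scores: dict[str, float], weights: dict[str, float]) -> float:
    """Weighted average of category scores, rounded to one decimal."""
    total_weight = sum(weights[c] for c in scores)
    weighted_sum = sum(scores[c] * weights[c] for c in scores)
    return round(weighted_sum / total_weight, 1)

# Example: a product type where usability is weighted more heavily than price.
scores = {"features": 8, "usability": 9, "support": 7, "price": 6}
weights = {"features": 0.3, "usability": 0.4, "support": 0.2, "price": 0.1}
print(overall_score(scores, weights))  # -> 8.0
```

Because the weights differ by product type, the same raw category scores can yield different overall scores in different markets, which is how a simple tool can outrank an enterprise suite within its own niche.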
Any existing user reviews are also considered in the editorial process, as day-to-day users are more likely to notice potential long-term benefits or shortcomings our team might miss. We do our best to note when praise or criticism originates from these user reviews instead of our editorial team.
Reviews are performed under the following conditions:
Remote demos are arranged with software developers so they can showcase their software to us and allow us to ask direct questions. Since we let the developers lead the demonstration, we do not follow a set script. Instead, we have a list of questions we ask during each demo. As with our general scoring criteria, these questions are sometimes modified based on the product’s intended audience.
In the event a live demo cannot be arranged, a pre-recorded demo is one way we can become familiar with the software. These recordings often cover the relevant topics, though they do not allow us to ask direct questions. We do our best to be clear when a review is the result of a pre-recorded demo instead of a live one, and how that might limit our assessment.
Finally, we access the software directly by using a trial account. We go through different workflows and try to see how intuitive the software is, what appears to be missing, and if there are any other limitations. Our ability to fully review the software may be limited if certain features are locked behind a paywall, which we do our best to indicate in the final review.
Ideally, we have at least one demo provided by the developer and access to a trial account in order to get a complete understanding of the software.
Our editorial team’s primary objective is to simplify the software search process for businesses. These reviews help potential buyers see at a glance which software might suit their needs.
Our reviews are performed as independently as possible by our editorial team. We review software from both our partners and high-performing developers with well-known products in a given space. Paid partners of our service may receive some priority when it comes to scheduling demos or accessing sandbox accounts. However, a paid partnership does not influence the final editorial score.
We can perform reviews at any time, which allows us to try out brand-new products or revisit existing software to see if there have been any changes worth noting in an existing review.
Regular updates ensure we have the most up-to-date information on software. Reevaluating software also allows us to keep our Editor’s Picks lists fresh for our users. If a software provider has made major changes to their product, such as launching a cloud-based version or adding new integrations, we are happy to arrange another demo and update our review.