How We Score Software
Our methodology is simple: we listen to real users, not vendors. Here's exactly how we do it.
The Problem We're Solving
Enterprise review platforms like G2, Capterra, and TrustRadius have a fundamental conflict of interest: they make money from the software vendors they review. Vendors pay for "sponsored placements," premium profiles, and lead generation — creating an incentive to keep scores high. The result? Nearly every product on these platforms has a 4.0+ rating, making them useless for actual decision-making.
We took a different approach. Instead of collecting reviews on our platform (which can be gamed), we analyze what real users say in places where there's no incentive to be anything but honest: Reddit, accounting forums, HR communities, and small business groups.
Data Sources
We collect and analyze discussions from:
- Reddit communities: r/smallbusiness, r/accounting, r/humanresources, r/startups, r/entrepreneur, r/bookkeeping, and 40+ other relevant subreddits
- Accounting forums: Professional accounting communities where CPAs and bookkeepers discuss client tools
- HR professional groups: SHRM communities, HR-focused Slack groups, and LinkedIn discussion threads
- Small business networks: Entrepreneur forums, local business Facebook groups, and startup communities
- Industry-specific forums: Restaurant owner groups, freelancer communities, e-commerce forums
We specifically exclude discussions that appear promotional, come from new/throwaway accounts, or contain affiliate links. We also weight recent discussions (last 12 months) more heavily than older ones, since software changes quickly.
Our Scoring Process
Step 1: Collection
We continuously monitor our data sources for discussions about business software. For each product category, we typically analyze between 2,000 and 10,000 individual discussions. Each mention is tagged with context: is the user recommending, warning against, or simply mentioning the product?
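To make the tagging concrete, here is a minimal sketch of what a collected mention record might look like. All names here (the `Mention` fields, the `Context` labels, the example product) are illustrative assumptions, not our actual internal schema.

```python
from dataclasses import dataclass
from datetime import date
from enum import Enum

class Context(Enum):
    RECOMMENDING = "recommending"      # user endorses the product
    WARNING = "warning"                # user advises against it
    NEUTRAL_MENTION = "neutral"        # product comes up in passing

@dataclass
class Mention:
    product: str
    source: str       # where the discussion was found, e.g. a subreddit
    posted_on: date   # used later for recency weighting
    text: str
    context: Context

# Example: a single tagged mention collected from a subreddit thread
# ("ExamplePayroll" is a made-up product name)
m = Mention(
    product="ExamplePayroll",
    source="r/smallbusiness",
    posted_on=date(2024, 5, 1),
    text="Switched six months ago; support response times are great.",
    context=Context.RECOMMENDING,
)
```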
Step 2: Filtering
Not all mentions are equal. We filter for quality signals: account age (minimum 30 days), stated usage duration (we prioritize users who've been using the software for 6+ months), specificity of feedback (detailed comments weighted higher than "it's good"), and community reputation of the commenter.
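The filtering step above can be sketched roughly as a hard cutoff plus a quality multiplier. The thresholds that appear in the text (30-day account age, 6+ months of usage) are real; the multiplier values and the 50-word detail threshold are illustrative placeholders, not our actual parameters.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class MentionSignals:
    account_age_days: int
    usage_months: Optional[float]  # stated usage duration, if any
    word_count: int                # crude proxy for specificity

MIN_ACCOUNT_AGE_DAYS = 30

def passes_filter(s: MentionSignals) -> bool:
    """Hard cutoff: drop mentions from accounts under 30 days old."""
    return s.account_age_days >= MIN_ACCOUNT_AGE_DAYS

def quality_weight(s: MentionSignals) -> float:
    """Weight the mentions that survive filtering: long-term users and
    detailed comments count more (multipliers here are made up)."""
    w = 1.0
    if s.usage_months is not None and s.usage_months >= 6:
        w *= 1.5   # prioritize users with 6+ months of usage
    if s.word_count >= 50:
        w *= 1.25  # detailed feedback over a bare "it's good"
    return w
```

A throwaway-account post is simply excluded, while a detailed comment from a long-term user ends up counting well above baseline.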
Step 3: Sentiment Analysis
We use AI-powered sentiment analysis calibrated specifically for software discussions. This goes beyond positive/negative — we score across dimensions like ease of use, customer support quality, value for money, reliability, and feature completeness. Each dimension contributes to the overall Authentic Score.
Step 4: Scoring
The Authentic Score (1.0 to 5.0) is a weighted composite of all sentiment dimensions. Recent mentions count more than old ones. Detailed reviews count more than brief mentions. Verified long-term users count more than new users. The score is recalculated monthly as new discussions are analyzed.
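A weighted composite with recency decay can be sketched as follows. The dimension names come from Step 3; the specific weights, the 180-day half-life, and the input format are all assumptions for illustration — we don't publish the real parameters.

```python
from datetime import date

# Hypothetical dimension weights (must sum to 1.0; not our real values)
DIMENSION_WEIGHTS = {
    "ease_of_use": 0.25,
    "support":     0.20,
    "value":       0.20,
    "reliability": 0.20,
    "features":    0.15,
}

def recency_weight(posted_on: date, today: date,
                   half_life_days: float = 180.0) -> float:
    """Exponential decay: a six-month-old mention counts half as much."""
    age_days = (today - posted_on).days
    return 0.5 ** (age_days / half_life_days)

def authentic_score(mentions, today: date) -> float:
    """Weighted composite of per-dimension sentiment scores (each 1.0-5.0).

    `mentions` is an iterable of (posted_on, quality_weight, dim_scores)
    tuples, where dim_scores maps dimension name -> sentiment score.
    """
    numerator = denominator = 0.0
    for posted_on, quality, dims in mentions:
        w = quality * recency_weight(posted_on, today)
        composite = sum(DIMENSION_WEIGHTS[d] * s for d, s in dims.items())
        numerator += w * composite
        denominator += w
    return round(numerator / denominator, 1) if denominator else 0.0
```

The key property is that the score is a weighted average, so stale or low-quality mentions fade in influence rather than being deleted outright.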
How We Make Money
Transparency matters. We make money through affiliate partnerships with some of the software companies we review. When you click a "Visit" link and sign up, we may earn a commission. This is clearly labeled on every page.
Critical point: affiliate relationships never influence our scores. Our rankings are based purely on user sentiment data. We partner with top-ranked products because they're genuinely good — not the other way around. Products we don't have affiliate relationships with receive the same scoring treatment.
Accuracy & Limitations
No review methodology is perfect. Forum discussions skew toward people with strong opinions (positive or negative), which may not represent the silent majority. We mitigate this by requiring large sample sizes and weighting balanced, detailed reviews higher than emotional outbursts.
Our scores also reflect the micro-business perspective (1-50 employees). A product that scores poorly here might be excellent for enterprises, and vice versa. We're specifically serving small business owners, freelancers, and startup founders.
Questions?
If you have questions about our methodology or want to report an issue with our data, contact us. We take accuracy seriously and will investigate any concerns.