OCN: A Website Builder For Car Dealers
QA Audit of a Website Building Platform for Car Dealers

Industry: Automotive
Business Type: Startup
Services rendered: QA audit
Team size: 2 QA engineers
Summary
SysGears conducted a QA audit of a website builder that enables car dealers to create professional, easy-to-navigate websites using ready-made templates. We performed end-to-end testing, including user acceptance, usability, smoke, and regression testing, to validate the platform’s functionality and stability under real user scenarios. We specifically focused on checking cross-browser and cross-platform compatibility to ensure the solution works consistently across devices.
We identified 60 issues, most of which affected usability and cross-platform consistency. In addition to bug reports, we delivered structured test documentation, comprehensive test cases, and end-to-end scenario diagrams to ensure full functional coverage. The client also received actionable recommendations for maintaining consistent quality as the platform evolves.
About Our Client
OCN is a subscription-based website-building platform designed specifically for car traders. It enables non-technical users to effortlessly create a website from pre-designed templates. OCN offers PRO and Lite subscription plans with different configuration packages. Check the platform in action here: OMRG Motors.
Why an Independent QA Audit Was Essential For the Successful SaaS Platform Launch
The client was preparing to launch their product and sought third-party QA engineers to identify defects the in-house team may have overlooked. Since OCN was designed as a SaaS solution, it was critical to ensure it was stable, secure, and able to deliver an uninterrupted user experience across different devices, operating systems, and browsers. SaaS products require even more rigorous testing than other types of applications, because even minor issues can affect many users at once, often resulting in customer churn and financial losses. This is why an independent QA audit was a must for OCN.
SysGears quickly assembled a team of two senior QA engineers, who started working immediately after a Service Agreement was signed.
QA Audit Goals and Requirements
- Verify the system’s compliance with software development best practices and check its stable operation across the intended platforms (iOS, Android, macOS, Windows) and the most popular browsers.
- Conduct diverse types of functional and non-functional testing, including user acceptance, smoke, and usability testing.
- Perform manual testing on real devices to accurately simulate actual user behavior.
- Conduct regression testing once improvements are introduced to verify everything works correctly before a final release.
- Create a structured document that illustrates test coverage and details pending and completed tasks.
- Organize daily standups to discuss identified defects and issues.
- Create a detailed test report and provide recommendations for improvement.
Our Approach
SysGears kept the engagement fast, efficient, and flexible by splitting the project into two stages:
Stage 1: QA Audit
Two senior QA engineers conducted a comprehensive QA audit over 7 days, identifying defects and usability issues around key user flows. Afterwards, the client’s development team fixed the reported problems.
Stage 2: Regression Testing
One senior QA engineer conducted regression testing over 10 days to verify that the bug fixes didn’t break other functionality.
What Has Been Done
User Acceptance Testing
We conducted UAT to verify that the software meets business requirements. Specifically, we tested the platform’s main flows, namely Sign Up, Log In, Select Subscription, Create a Website, Publish a Website, Delete a Website, Profile Settings, Upgrade / Downgrade the Plan, as well as the landing page.
Cross-Platform and Cross-Browser Testing
We tested the platform’s user flows across different browsers (Google Chrome, Edge, Safari, Firefox) and platforms (Android, iOS, macOS, Windows).
Usability Testing
As the platform was designed for non-technical users, we specifically validated how intuitively the users could interact with the system and complete their tasks. Special attention was dedicated to navigation logic, UI consistency, and clarity of user interactions.
Test Design Techniques Applied
Equivalence Partitioning and Boundary Value Analysis
We used these techniques to ensure that forms behave correctly at boundary values, maximize test coverage, and minimize the number of test cases.
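To illustrate boundary value analysis, a check for a single form field might look like the sketch below; the field (password length) and its limits are hypothetical, not taken from OCN’s actual requirements:

```python
# Hypothetical sketch: boundary value analysis for a password-length rule.
# Assumed limits (not from OCN's spec): 8-64 characters.

MIN_LEN, MAX_LEN = 8, 64

def password_length_valid(password: str) -> bool:
    """Accept a password only if its length is within the allowed range."""
    return MIN_LEN <= len(password) <= MAX_LEN

# Boundary value analysis: test just below, at, and just above each boundary,
# instead of enumerating every possible length.
cases = {
    "a" * (MIN_LEN - 1): False,  # below the lower boundary
    "a" * MIN_LEN: True,         # at the lower boundary
    "a" * (MIN_LEN + 1): True,   # just above the lower boundary
    "a" * (MAX_LEN - 1): True,   # just below the upper boundary
    "a" * MAX_LEN: True,         # at the upper boundary
    "a" * (MAX_LEN + 1): False,  # above the upper boundary
}

for value, expected in cases.items():
    assert password_length_valid(value) == expected
```

Each boundary needs only a handful of cases; the rest of each equivalence class is assumed to behave the same way.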
State Transition Diagrams
We validated how the platform behaves when users move between different stages and actions, identifying broken, illogical, or incomplete flows.
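A minimal sketch of this technique, assuming a simplified website lifecycle (the states and actions here are illustrative, not OCN’s actual model):

```python
# Hypothetical sketch of a website's lifecycle as a state-transition model.
# States and actions are illustrative, not OCN's actual workflow.

ALLOWED_TRANSITIONS = {
    "draft":     {"publish": "published", "delete": "deleted"},
    "published": {"unpublish": "draft", "delete": "deleted"},
    "deleted":   {},  # terminal state: no further actions allowed
}

def apply_action(state: str, action: str) -> str:
    """Return the next state, or raise if the transition is invalid."""
    try:
        return ALLOWED_TRANSITIONS[state][action]
    except KeyError:
        raise ValueError(f"illegal transition: {action!r} from {state!r}")

# A valid path through the diagram:
assert apply_action("draft", "publish") == "published"
assert apply_action("published", "delete") == "deleted"

# The kind of broken flow this technique is designed to catch:
try:
    apply_action("deleted", "publish")
except ValueError:
    pass  # expected: a terminal state accepts no actions
else:
    raise AssertionError("terminal state accepted an action")
```

Walking every edge of such a diagram exposes transitions that are missing, illogical, or reachable when they shouldn’t be.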
Cause-Effect Analysis
Since the platform included multiple decision-based flows, it was important to verify that different combinations of user actions and conditions (cause) produce the correct system responses (effect).
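The technique can be captured as a decision table that pairs every cause combination with its expected effect; the checkout conditions and outcomes below are a hypothetical illustration, not OCN’s real business rules:

```python
# Hypothetical cause-effect sketch for a subscription checkout flow.
# Conditions and outcomes are illustrative, not OCN's actual rules.

def checkout_outcome(logged_in: bool, plan_selected: bool, payment_ok: bool) -> str:
    """System response (effect) for a combination of conditions (causes)."""
    if not logged_in:
        return "redirect_to_login"
    if not plan_selected:
        return "prompt_plan_selection"
    return "activate_plan" if payment_ok else "show_payment_error"

# Decision table: each row is (causes..., expected effect).
table = [
    (False, False, False, "redirect_to_login"),
    (False, True,  True,  "redirect_to_login"),
    (True,  False, True,  "prompt_plan_selection"),
    (True,  True,  False, "show_payment_error"),
    (True,  True,  True,  "activate_plan"),
]

for logged_in, plan_selected, payment_ok, expected in table:
    assert checkout_outcome(logged_in, plan_selected, payment_ok) == expected
```

Writing the table first, then checking the system against it, surfaces condition combinations the original implementation never considered.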
Error Guessing
We applied this technique to anticipate system errors under conditions where they typically occur, identifying issues that might not be detected through standard tests.
Pairwise Testing
This technique helped us efficiently test different combinations of browsers and operating systems while reducing the number of test cases.
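A greedy pairwise reduction can be sketched as follows; the browser and OS values mirror the matrix above, with a hypothetical subscription-plan parameter added to show how the case count shrinks:

```python
# Sketch of greedy pairwise reduction. The "plan" parameter is a
# hypothetical addition to illustrate the technique; real pairwise
# tools (e.g. PICT, AllPairs) use more sophisticated algorithms.
from itertools import combinations, product

params = {
    "browser": ["Chrome", "Firefox", "Safari", "Edge"],
    "os": ["Windows", "macOS", "Android", "iOS"],
    "plan": ["PRO", "Lite"],
}
names = list(params)

def pairs_of(case):
    """All (parameter, value) pairs a single test case covers."""
    items = [(n, case[n]) for n in names]
    return set(combinations(items, 2))

# Collect every value pair that must be exercised at least once.
uncovered = set()
for a, b in combinations(names, 2):
    for va, vb in product(params[a], params[b]):
        uncovered.add(((a, va), (b, vb)))

# Greedy reduction: keep a candidate case only if it covers a new pair.
suite = []
for values in product(*params.values()):
    case = dict(zip(names, values))
    new = pairs_of(case) & uncovered
    if new:
        suite.append(case)
        uncovered -= new

assert not uncovered           # every value pair is exercised
assert len(suite) < 4 * 4 * 2  # fewer than the 32 exhaustive cases
```

The suite still exercises every browser/OS, browser/plan, and OS/plan pair, but with noticeably fewer runs than the full cartesian product.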
Key Findings
The audit revealed 60 issues: 21 high- and medium-priority bugs, 27 low-priority bugs, and 12 improvement tickets.
The most critical issues included the following:
Mobile UX and responsiveness issues: on mobile devices, several workflows contained illogical navigation paths that could confuse users during key actions, such as Create a Website.
Navigation and redirect logic issues: some user actions triggered incorrect or illogical redirects between pages, and some expected flows (e.g., after sign-up) were illogical or missing.
Profile management issues: certain profile management workflows behaved incorrectly; for example, a success message appeared instead of an error message, and users could set their current password as the new one.
Poor input validation in user forms: for example, some forms allowed users to enter an e-mail address without a top-level domain.
Authentication issues: social login integrations were not functioning properly.
Cross-platform compatibility issues: the platform demonstrated inconsistent behavior across certain device-browser combinations.
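The missing top-level-domain finding can be reproduced with a simple validation check; the regular expression below is an illustrative sketch of the expected behavior, not the fix OCN applied (production-grade email validation is considerably more involved):

```python
import re

# Illustrative pattern only: requires a dot-separated top-level domain
# of at least two letters after the "@" part.
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[A-Za-z]{2,}$")

def looks_like_valid_email(address: str) -> bool:
    """Reject addresses without a plausible top-level domain."""
    return EMAIL_RE.fullmatch(address) is not None

assert looks_like_valid_email("dealer@example.com")
assert not looks_like_valid_email("dealer@example")  # no top-level domain
assert not looks_like_valid_email("dealer@.com")     # empty domain label
```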
Our QA audit helped the client identify potential user experience issues and prepare the platform for a successful release.
Client’s Feedback
“Our cooperation with SysGears was a really positive experience. They communicated regularly, provided well-structured reports, and met the deadlines. Their QA audit revealed hidden user experience errors that could have resulted in customer churn if left unattended. Thanks to SysGears, we managed to detect and fix problems before launch, thus securing the success of our product.”
Osayi Omoregie
Founder of OMRG Motors
All Technologies Used
TestRail
Qase
Postman
JMeter