The client is a company that provides tools for A/B testing and has also introduced feature testing through staged rollouts. Their SDKs let customers serve variations of the same application and analyze impressions and conversions as end users interact with it. Their reporting tool presents statistical data along with recommendations on how to increase profit, and the platform also provides a robust framework for controlling audiences and variant variables.
The client provides SDKs for its A/B testing functionality in multiple languages and on multiple platforms. The most challenging part is keeping behavior uniform across all of those languages. Manually testing more than 700 scenarios in each SDK (language) was a very demanding task, especially since every commit merged into the open-source repositories, to which a large community contributes, had to be verified.
We provided a test automation solution that verifies that every commit stays aligned with the core functionality and that no change to the core functionality breaks any SDK. We also helped ensure that every major and minor SDK release ships stable and bug free.
We started with an HTTP wrapper for every SDK/library that conforms to a common protocol for our test scenarios, exposing the same interface for all SDKs via REST endpoints. We set up Jenkins with multiple agent nodes to accommodate concurrent requests. Jenkins was configured so that every repository event triggered a build that ran the full scenario suite; we also set up a pipeline that runs all of the SDKs against the test scenarios whenever the scenarios themselves are modified. We used Docker containerization, with the setup instructions baked into each Dockerfile.
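To illustrate the idea, a wrapper of this kind can be sketched with Python's standard library alone. The endpoint name (`/activate`), payload fields, and the `FakeSdk` class below are hypothetical stand-ins for whatever the real SDK exposes; each language's wrapper would delegate to its own native SDK client behind the same REST contract.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical stand-in for the real SDK under test. In a real wrapper,
# this would be the native SDK client for that language.
class FakeSdk:
    def activate(self, experiment_key, user_id):
        # Deterministic bucketing stub so the example is self-contained.
        return "variation_a" if hash(user_id) % 2 == 0 else "variation_b"

sdk = FakeSdk()

class WrapperHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Same endpoint and payload shape in every language's wrapper,
        # so a single scenario suite can drive all SDKs identically.
        if self.path != "/activate":
            self.send_error(404)
            return
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length))
        variation = sdk.activate(payload["experiment_key"], payload["user_id"])
        body = json.dumps({"variation": variation}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep request logging quiet

def serve(port=8000):
    """Run the wrapper; the scenario runner then POSTs to /activate."""
    HTTPServer(("127.0.0.1", port), WrapperHandler).serve_forever()
```

Because every wrapper speaks this same REST contract, the 700+ scenarios are written once and executed unchanged against each language's SDK.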
We also had a requirement to monitor SDK behavior in terms of hardware usage and memory leaks: detecting leaks and abnormal resource consumption is essential when an application runs for a long time. For this we set up cron jobs that trigger Terraform to generate a plan and execute it; once the plan is applied and the instance is up, everything is provisioned through user data, and all results are uploaded to S3.
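The leak-detection half of such a check can be sketched in Python using the standard library's `tracemalloc` module. The function names and the deliberately leaky workload below are illustrative assumptions, not the client's actual tooling; the real runs exercised the SDKs themselves on the provisioned instance.

```python
import tracemalloc

def find_leaks(workload, iterations=100, top=3):
    """Run `workload` repeatedly and report the allocation sites that
    grew the most -- a rough stand-in for long-running SDK leak checks."""
    tracemalloc.start()
    workload()  # warm-up call so one-time allocations are excluded
    before = tracemalloc.take_snapshot()
    for _ in range(iterations):
        workload()
    after = tracemalloc.take_snapshot()
    tracemalloc.stop()
    # Sites with a large positive size_diff are leak suspects.
    return after.compare_to(before, "lineno")[:top]

# Deliberately leaky workload for demonstration: it keeps appending to a
# module-level list, so its memory footprint grows on every call.
_leak = []
def leaky_workload():
    _leak.append("x" * 1024)

for stat in find_leaks(leaky_workload):
    print(stat)
```

A report like this, generated periodically on the Terraform-provisioned instance, is the kind of artifact that would then be uploaded to S3 for later inspection.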