Project Goal
The goal of this project was to automate the creation of A/B tests within Optimizely Full Stack, a popular experimentation platform. This involved creating feature flags, setting up variations, enabling rules for different environments, updating an Airtable record with the relevant information, and creating a JIRA ticket. It streamlined a manual process, saving significant time – a single button click in Airtable versus roughly 15 minutes of tedious work – and reducing the chance of human error.
How it was built
The system is built in Python, leveraging libraries such as `requests` for API calls, `pyairtable` for Airtable interaction, and `json` for data handling. The main logic lives in `main.py`, where a Google Cloud Function is triggered by an HTTP request. This function then calls other modules to perform specific tasks:
- Airtable Interactions: `airtable_functions.py` retrieves a unique dimension number and updates the corresponding record in Airtable.
- Optimizely API Calls: the core logic in `main.py` creates feature flags and variations and sets up experiments using the Optimizely API.
- JIRA API Interactions: `jira_functions.py` generates a JIRA ticket (now disabled).
- Utility Functions: `utils.py` handles string cleaning, title length checks, and token retrieval based on user data.
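To illustrate the utility layer, here is a minimal sketch of the kind of string cleaning and title-length checking `utils.py` performs. The function names, the regex, and the length limit are assumptions for illustration, not the project's actual code:

```python
import re

MAX_TITLE_LENGTH = 64  # assumed limit; experiment keys typically have a max length


def clean_string(raw: str) -> str:
    """Lower-case, trim, and replace non-alphanumeric runs with underscores,
    producing something usable as a feature-flag key."""
    cleaned = re.sub(r"[^a-z0-9]+", "_", raw.strip().lower())
    return cleaned.strip("_")


def title_within_limit(title: str) -> bool:
    """Check that the cleaned title fits the assumed key-length limit."""
    return len(clean_string(title)) <= MAX_TITLE_LENGTH


# clean_string("Homepage Hero Test!") -> "homepage_hero_test"
```

Normalising the test title once up front keeps every downstream payload (Optimizely, Airtable, JIRA) consistent.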
The function dynamically builds API payloads based on the input data and performs multiple API calls to Optimizely and Airtable. It also includes error handling to log any issues that arise during the process.
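The payload-building and error-handling pattern can be sketched as follows. The endpoint shape, function names, and payload fields are assumptions based on the description above, not the project's actual implementation:

```python
import json
import logging
import re

import requests  # third-party: pip install requests

# Assumed endpoint shape for the Optimizely Flags REST API.
FLAGS_URL = "https://api.optimizely.com/flags/v1/projects/{project_id}/flags"


def build_flag_payload(test_name: str, description: str = "") -> dict:
    """Dynamically derive a flag key from the test name and assemble the request body."""
    key = re.sub(r"[^a-z0-9]+", "_", test_name.strip().lower()).strip("_")
    return {"key": key, "name": test_name, "description": description}


def create_feature_flag(project_id: int, token: str, payload: dict) -> dict:
    """POST the payload to Optimizely, logging and re-raising any failure."""
    try:
        resp = requests.post(
            FLAGS_URL.format(project_id=project_id),
            headers={
                "Authorization": f"Bearer {token}",
                "Content-Type": "application/json",
            },
            data=json.dumps(payload),
            timeout=30,
        )
        resp.raise_for_status()  # surface 4xx/5xx responses as exceptions
        return resp.json()
    except requests.RequestException as exc:
        logging.error("Optimizely flag creation failed: %s", exc)
        raise
```

Separating payload construction from the API call keeps the dynamic-building logic unit-testable without touching the network.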
Technologies used
- Python
- Google Cloud Functions
- Optimizely API
- Airtable API
- JIRA API
- `requests` library
- `pyairtable` library
- `json` library