We’re data-heavy and data-hungry, so you have to LOVE data. Much of your job will be working with large datasets at scale: writing complex queries, building reports, and helping us extract more value from our dataset as it grows.
You’ll also be meshing together data from external services and APIs, so we can continue to offer our customers a more complete picture of how their site and pages are performing.
Our app is broken up across several microservices and powered by Google Cloud / Kubernetes.
To apply for this job, please include “I love data” in the first sentence of your message or cover letter, or your application will be disqualified. After that, please provide three specific examples of developing and maintaining complex backend API systems and architectures.
- Work on our Rails-based JSON APIs, auth services, data pipelines and databases
- Help maintain code quality, organization, and automation
- Analyze existing data to surface helpful predictions for our customers (to help them grow their websites)
- Help build complex data queries that mash up large datasets in a performant way
- Work with third-party data scientists and machine learning experts to extract meaning from our data
- Improve our crawler and help increase the data we’re gathering from pages: more on-page information (word count, etc.)
- Integrate more 3rd party services to enhance and augment our available data
- Understand and influence the vision and overall strategy
- Ruby/Rails expert (5+ years minimum)
- Postgres/Database expert - you should be very comfortable with low-level database engineering, complex queries, and large datasets.
- API development expert (5+ years minimum)
- Experience with big data analysis
- Machine learning experience is a big bonus
- TimescaleDB experience is a bonus
Nice to haves (any/all of the following)
- Big data databases
- Message queues
- Machine learning
- Data science

Our stack
- Ruby 2.5.x
- Rails 5 (API only)
- VueJS 2.6.x
- Python/R (stats microservice)
- Node (page speed microservice)