Netquest is among the most advanced and innovative digital data collection specialists in the market research and analytics industry. We currently have more than 1,700,000 consumers across 21 countries willing to share their behavioural data and give us their opinion, and we are now expanding into 18 new countries.
Our ambition is to be the most reliable, flexible and powerful data source in the market, and contribute to the future of market research through automation and innovation.
Netquest is looking for a talented Business Intelligence Engineer · ETL Specialist to join our team. As part of a growing Business Intelligence team you will expand our data platform to solve our current and future challenges. You will work with multiple data sources, helping transform data into business knowledge.
The main responsibilities of this team are:
- Set up and maintain the data platform at both the software infrastructure level and the data level.
- Design, build, and maintain processes and components of a streaming data/ETL pipeline to support real-time analytics.
- Manage deployment of data solutions across the development, integration and production environments.
- Collaborate with data scientists to design and develop processes that support data science initiatives on a common data platform.
Your main responsibilities will be:
- Analyse the data requirements and define the needed data sources, formats and extraction frequency.
- Develop extraction components for different data sources (databases, APIs, files, etc.) with programming languages such as Java or Python.
- Design complex queries in MySQL or PostgreSQL, optimizing the performance of extraction and analysis over large datasets.
- Design and implement multi-dimensional models in a distributed database like AWS Redshift.
- Develop transformation jobs with an ETL tool like Talend or a distributed computing framework like Spark.
- Automate the management of the data pipeline using tools like AWS Data Pipeline, AWS Batch or AWS Lambda, along with bash scripts.
- Distribute streams of events with AWS Kinesis.
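To give a flavour of the day-to-day work described above, here is an illustrative (not prescriptive) sketch of a minimal extract-transform-load step in Python. The event records and the target table are hypothetical stand-ins; SQLite is used in place of PostgreSQL or Redshift so the example is self-contained:

```python
import sqlite3

# Hypothetical raw events, standing in for rows extracted from an API or source DB.
RAW_EVENTS = [
    {"panelist_id": "p1", "country": "ES", "event": "survey_completed"},
    {"panelist_id": "p2", "country": "MX", "event": "survey_completed"},
    {"panelist_id": "p1", "country": "ES", "event": "survey_completed"},
    {"panelist_id": "p3", "country": "ES", "event": "survey_started"},
]

def extract():
    """Extract step: in a real pipeline this would query a database or call an API."""
    return list(RAW_EVENTS)

def transform(events):
    """Transform step: keep completed surveys and aggregate counts per country."""
    counts = {}
    for e in events:
        if e["event"] == "survey_completed":
            counts[e["country"]] = counts.get(e["country"], 0) + 1
    return sorted(counts.items())

def load(rows, conn):
    """Load step: write aggregated rows into a warehouse table
    (SQLite here, standing in for PostgreSQL or Redshift)."""
    conn.execute("CREATE TABLE IF NOT EXISTS completions (country TEXT, n INTEGER)")
    conn.executemany("INSERT INTO completions VALUES (?, ?)", rows)
    conn.commit()

def run_pipeline(conn):
    load(transform(extract()), conn)

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    run_pipeline(conn)
    print(conn.execute("SELECT * FROM completions").fetchall())
    # → [('ES', 2), ('MX', 1)]
```

In production these steps would typically be separate jobs scheduled by an orchestrator, with the transform expressed as SQL or a Spark job rather than plain Python.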
You’re a great candidate if…
- You have a Bachelor’s Degree in Computer Science or a related technical field.
- You’re an ace at data modeling, accustomed to designing and implementing complex architectures with a constant eye on their future evolution, while taking into account the needs of multiple users.
- You can code SQL in your sleep, optimizing queries for performance, scalability, and ease of maintenance.
- You have experience integrating data from multiple sources including DBs and APIs.
- You get excited by seeing your jobs run like clockwork.
- You are knowledgeable about various ETL techniques and frameworks.
- You have experience as a back-end developer (Java, Python or Scala).
- You have been using Amazon Web Services, Azure or similar platforms.
- You are able to work out effective solutions under uncertain or ambiguous circumstances.
- You’re always willing to learn something new and embrace a healthy debate.
- You enjoy working in a team environment.
It’s a bonus if you have:
- Experience with AWS Redshift or other MPP databases.
- Experience with Talend Data Integration.
- Experience with streaming platforms like Kafka or AWS Kinesis.
- Experience with Apache Spark or Hadoop.
- Experience with orchestration tools like Airflow or Luigi.
- Some experience with Machine Learning and predictive algorithms.
- Some experience working with agile methodologies (Scrum/Kanban).