We operate one of the largest big-data environments in the Baltics and personalize offers for 40 million consumers daily for clients in the Baltics, the USA, and Central and South America. Our services cover the full range of data services, from big data management & engineering to data-driven customer value management solutions (Up-sell, Cross-sell, Retention, and Migration).
We are looking for a Data Architect who will lead and develop the newest Data Engineering solutions for our clients.
About the team:
We build advanced solutions, and we’re expanding. The Data Mission team specializes in projects involving data preparation, transformation, and aggregation at large scale.
Our main challenge is solving real business problems for our clients, leveraging the Hadoop stack and cloud platforms such as AWS and Microsoft Azure, alongside core data warehousing tools and other big data technologies.
We use established data architecture practices to architect, design, and develop data/analytics platforms (e.g., data lakes, lakehouses, data warehouses) used to produce analytical products such as reports, dashboards, and ML models.
We are talking about petabytes of data.
You will work with new clients on new Data Engineering solutions, designing and developing data/analytics platforms (e.g., data lakes, lakehouses, data warehouses) used to produce analytical products such as reports, dashboards, and ML models.
Your main responsibilities:
- Work with a team of data engineers to create blueprints for modern data management systems and deliver against the overall technology data strategy.
- Document all data architecture components of an overall solution architecture on projects.
- Help to guide Big Data solutions for our clients, including requirements analysis, platform selection, technical architecture design, development, testing, and deployment.
- Work hands-on on various data engineering tasks: build and optimize data flows/pipelines, create or renew dimensional data models.
- Research, analyze, and recommend technical approaches for solving complex and challenging development and integration problems.
- Maintain a growth mindset: learn new Big Data technologies and continuously optimize our products and services.
We expect you to have:
- Experience in Big Data pipeline development, including batch and streaming data processing.
- Hands-on experience with several Big Data components, such as Hadoop, Hive, Spark, Spark Streaming, Flink, HBase, Kafka, Airflow, Elasticsearch, and Redis.
- Familiarity with cloud technologies such as AWS S3, AWS Glue, AWS Redshift, Azure Blob Storage, Azure Data Factory, Azure Synapse, Snowflake, or Databricks Delta Lake.
- Teamwork skills, an analytical mind, optimism, willingness to take on challenges, and quick response to incidents.
- Familiarity with agile software development practices and the drive to ship quickly.
Bonus points if you:
- Have advanced experience in big data product development, including data warehouses, query platforms, streaming, etc.
- Have expert-level data warehouse architecture and design skills (multidimensional data modeling experience).
We offer:
- Salary for this position from 4959 EUR to 6700 EUR gross (before taxes).
- Participation in the company’s stock options program.
- Personal learning budget (2000 EUR/year).
- Every second Friday dedicated to learning.
- Ownership in Data solutions creation and development.
- All the support you need from our experienced team to become an even better professional.
- Quarterly team-building events.
- And most importantly – a super smart and great team!
Up for the challenge?
Send an email with your LinkedIn profile or CV to firstname.lastname@example.org.