Data Architect


We are looking for a Data Architect to lead the design and development of new Data Engineering solutions for our clients.

We operate one of the largest big-data environments in the Baltics and personalize offers for 40 million consumers daily for clients in the Baltics, the USA, Central and South America, and the Caribbean. Our services cover the full range from big data management & engineering to data-driven customer value management solutions (Up-sell, Cross-sell, Retention, and Migration).

About the team:

You will join the Data Lake team, which specializes in projects related to data preparation, transformation, and aggregation at large scale.

What is waiting for you?

  • A new client and new Data Engineering solutions.
  • Designing and developing data/analytics platforms (e.g., data lakes, data warehouses) used to produce analytics products such as reports, dashboards, and ML models.

What you will do:

  • Work with a team of engineers to create blueprints for modern data management systems, and deliver against the overall technology data strategy.
  • Document all data architecture components of the overall solution architecture on each project.
  • Help guide Big Data solutions for our clients, including requirements analysis, platform selection, technical architecture design, development, testing, and deployment.
  • Work hands-on on various data engineering tasks: build and optimize data flows/pipelines, and create or refresh dimensional data models.
  • Research, analyze, and recommend technical approaches for solving difficult and challenging development and integration problems.
  • Keep learning Big Data technologies and continuously optimize products and services.

We expect you to have:

  • Experience in Big Data pipeline development including batch and streaming data processing.
  • Hands-on experience with several Big Data components such as Hadoop, Hive, Spark, Spark Streaming, Flink, HBase, Kafka, Airflow, Elasticsearch, and Redis.
  • Familiarity with at least one cloud technology such as AWS S3, AWS Glue, Amazon Redshift, Azure Blob Storage, Azure Data Factory, Azure Synapse, Snowflake, or Databricks Delta Lake.
  • Teamwork, an analytical mind, optimism, willingness to take on challenges, and the ability to respond quickly to incidents.
  • Familiarity with agile software development practices and a drive to ship quickly.

Bonus points if you:

  • Have advanced experience in big data product development, including data warehouses, query platforms, streaming, etc.
  • Have expert-level data warehouse architecture and design skills (multidimensional data modeling experience).

We promise:

  • Salary for this position: from 4800 EUR to 6100 EUR gross.
  • Ownership in Data solutions creation and development.
  • All the support that you need from our experienced team to become an even better professional.
  • Participation in the company’s stock options program.
  • Personal learning budget (2000 EUR/year).
  • Every second Friday, free time is dedicated to learning.
  • Quarterly team-building events.
  • And most importantly: a super smart & great team!

We carefully review every application and contact selected candidates.

You can contact us at cv@exacaster.com.

Submit your application

    First Name and Last Name (required)

    Email (required)

    Position you are applying for (required)

    Upload resume file (.pdf, .doc or .txt, file size up to 2 MB):


    Tell us more about yourself and the job you're after:

    I agree with Exacaster using my name and email address for the purpose of contacting me.

    Your privacy is important to us. This form collects your name and email so that we can answer your request. Check our privacy policy for the full story on how we protect and manage your submitted data!
