Big Data Engineer

Remote - Barcelona, Catalonia, …

Job description

Wallbox is a global company, dedicated to changing the way the world uses energy. We do this by creating smart charging systems that combine innovative technology with outstanding design and manage the communication between vehicle, grid, building, and charger.

Founded in 2015, with headquarters in Barcelona and already selling in over 80 countries, our mission is to facilitate the adoption of electric vehicles today to make more sustainable use of energy tomorrow. Our talent has no borders - we welcome over 700 Wallboxers from over 45 nationalities in Europe, Asia, and the Americas!

We’re now the first Spanish unicorn listed on the NYSE ($WBX) and have been recently named amongst LinkedIn’s top 3 Spanish startups.

Wallbox is committed to becoming a data-driven company, not only to improve our decision-making processes, but also to help us better steer our product development, incorporate data into our existing products, and even create new business models.

We started our data journey just one year ago, and it has already been an amazing ride! We've built a passionate and vibrant team, organized in a hub & spoke model, with roles such as data analysts, data engineers, and analytics engineers.

We've also been building a modern data stack, with technologies like Snowflake, Airflow, Fivetran, DBT, Tableau, and Databricks.

Today our data lake is ingesting more than 1M events per hour coming from more than 350 sensors in our chargers. And this is just the beginning, as our goal is to reach one million online chargers by 2025.


If you love real-time processing, consuming large amounts of data, and building data lakes with a modern stack, this is your position. Our IoT platform is scaling fast, and we are meeting this challenge with new Big Data Engineers:

  • 3+ years of experience working as part of a data team, preferably as a data engineer.
  • Working experience implementing ETL in data lake infrastructures.
  • Fluency in one or more common data-related programming languages (Python, Scala, Java, or similar).
  • Working experience with cloud providers, especially AWS.
  • Experience with streaming platforms like Confluent Kafka or AWS Kinesis.
  • Experience with distributed computing projects, especially with Apache Spark.
  • Familiarity with software development best practices and their application to analytics (version control, testing, CI/CD, automation).
  • Experience working with data scientists and analysts.
  • English is a must.


What you'll do:

  • Ingest streaming data sources via an event bus like Confluent Kafka or Kinesis.
  • Deal with schema evolution using a schema registry (like Confluent Schema Registry or AWS Glue Schema Registry) and data formats like Avro, Parquet, or Delta.
  • Design, develop, and deploy data lakes in AWS.
  • Process real-time data in a time-series database.
  • Process data in an event-driven architecture and extend our code base according to the hub standards.
  • Develop transformation jobs with a distributed computing framework like Apache Spark or an SQL-based transformation framework like DBT.
  • Automate data pipelines using tools like Apache Airflow.
  • Apply software engineering best practices like version control and continuous integration to the analytics code base, performing deployments via GitLab and guaranteeing a healthy development life cycle.
  • Coach analysts and data scientists on software engineering best practices.

    Nice to have:
  • Experience processing large volumes of data in data lakes.
  • Experience with Databricks or Confluent.
  • Experience with time-series databases like InfluxDB or TimescaleDB.
  • Experience with task orchestration tools, especially Airflow.
  • Experience with GitLab CI or GitHub Actions.
  • Familiarity with infrastructure and automation tools (Terraform, CloudFormation, or similar).

Soft skills:

  • You are able to work out effective solutions under uncertain or ambiguous circumstances.
  • You’re always willing to learn something new and embrace a healthy debate.
  • Quality in mind. You can easily tell whether a data result is good or bad in terms of quality, and you understand that building good code with strong testing is key to growth and sustainability.
  • Strong analytical, problem-solving skills and critical mindset.
  • You have experience designing and implementing features in collaboration with product owners, reporting analysts/data analysts, and business partners within an Agile / Scrum methodology.


What we offer:

  • 100% company-paid individual medical & dental insurance, after six months
  • Remote
  • Attractive compensation package
  • Flexible working hours
  • Friday afternoons off
  • Opportunity to get a payroll advance (on request)
  • Unlimited coffee & beverages
  • Language classes (English & Spanish)
  • Sports channel offering online classes until our gym opens
  • Monthly “All Hands” & other team events
  • Brand new canteen with a variety of breakfast and lunch dishes, every day, at a discounted price
  • Brand new offices in Zona Franca
  • Over 20 different nationalities
  • No suits! Unless it’s Carnival or Halloween

*At Wallbox, we're committed to equal employment opportunity regardless of race, colour, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender, gender identity or expression, or veteran status. We strive to be an equal opportunity workplace.

© Copyright 2022 remotemachinelearning.com. All Rights Reserved.