Data Engineer (100%)
Permanent
techies
Innovation Hub Portugal
Become part of a team building the next-generation digital universal bank, focused on sustainability, from scratch. radicant is a dynamic start-up aiming to democratise access to personalised and sustainable financial services. Using technology and the collaboration of our community, we help our customers achieve their individual financial objectives over the course of their lifetime - 24/7, whilst giving their money a higher purpose.
Are you a digitally savvy, purpose-driven individual with a start-up mindset, seeking to join a passionate team to create long-lasting value for people and the planet?
We are looking for a Data Engineer to join our Data & ML Platform team. The team is responsible for efficient data synchronisation, structuring and processing, and combines frontend and backend services and integrations into a data-rich platform. Our APIs power most services in the radicant ecosystem as well as our machine learning components, and provide the technical foundation for our growth.
Tech Stack: GCP, Datafusion, Dataflow, Kubernetes, Airflow (Composer), Spring Boot, Java, Scala, Python, dbt, BigQuery
"The future depends on what you do today"
Christoph Schwarz | CTO
What you can achieve with us
- Build batch and event-driven data pipelines that empower data scientists' exploratory work as well as the training and deployment of machine learning models
- Build infrastructure and components following best practices such as CI/CD and infrastructure as code
- Work on GCP leveraging managed services and open source whenever suitable
- Move and process large amounts of data with the help of tools like Dask, Spark, BigQuery, Batch, Google Dataflow, Google Cloud Data Fusion and others
- Evaluate and compare technologies to identify the best solution for each problem
- Act as an expert technical resource in cross-functional work with Data Scientists, Researchers and Software Engineers
What you bring to the table
- 5+ years of experience developing, monitoring and debugging production systems
- Experience running containerised workloads
- Experience or willingness to work with Python, Java and JavaScript
- Good architectural understanding of event-driven architectures, workflow engines, database and data warehouse systems, to help us scale with our growing user base
- Proven expertise with web-based, service-oriented application architectures and designs (REST, HTTP stack)
- Familiarity with common practices: version control (git), issue tracking, unit testing and agile development processes
- A pragmatic, can-do attitude and a delivery-focused mindset: you can handle trade-offs between short-term goals and long-term tech debt
- Willingness both to execute on your expertise and to learn new skills
What we offer
- A unique opportunity to build a tech company from greenfield to market leader
- A fast-growing and international team of highly qualified people
- The flexibility and tools to work remotely
- Most importantly: We move fast and have fun while doing it
We are looking for passionate people who believe in our mission and feel inspired to grow with us, rather than someone who merely checks the boxes but isn't invested. Please reach out and show us what you've got.