Full Stack Data Developer


Details

Competence Area
Fullstack
Location
Solna
Deadline
2024-08-15
Seniority level
Senior
Remote
20%
Reference
FG-5w573V

The Role

We are looking for an experienced Full Stack Data Developer for a client in the logistics industry.

Start date: 2024-08-26
End date: 2025-05-31

The client has an ambition to become better at data-driven decision making, and to achieve this we have started several Lighthouses. To support the new First/Last Mile Logistics Lighthouse team, we are looking for an experienced Full Stack Data Developer who can participate in developing and maintaining the data products and assets required to deliver business value to diverse usage archetypes.

The consultant should have prior experience working with data-heavy digital solutions and in the logistics industry, and should be able to participate in discussions and workshops covering a wide range of aspects of how the client operates as a company and how the proposed changes fit within the existing system landscape.

Our new data platform is built on the Azure/Databricks tech stack, so prior experience with Azure/Databricks and with modern data platforms in general is important. Previous experience in the logistics industry, as well as with first- and last-mile delivery planning and route optimization, is a plus.

Where relevant, the client has decided that Microsoft's Power BI will be the main platform for creating, sharing, and presenting data. There may also be other scenarios involving full stack applications and/or data science workflows.

More information:
The Lighthouse is already live and is in the process of deciding the overall OKRs, documenting the as-is workflows, and outlining the target workflows, including the relevance of data flows.

As the Full Stack Data Developer for the First/Last Mile Logistics Lighthouse, you will help develop the architectural design in collaboration with the Lighthouse solution architect and other architects.
The position involves working closely with the business to identify issues and use data to propose solutions for effective decision making. It also involves designing and building experiments to merge, manage, interrogate, and extract data in order to supply tailored reports to colleagues, customers, or the wider organization.

The position includes the following responsibilities:
• Develop and implement logical data models
• Optimize data schemas for read and write performance and storage
• Design, develop, and maintain data pipelines for data ingestion and transformation; performance-tune data pipelines
• Ensure data quality and consistency in developed data products
• Set up and manage infrastructure resources for building data pipelines, including databases, cloud storage, and data pipeline runtime environments
• Implement data governance and regulatory compliance policies

Work from home: The majority of the work will be done in Solna, Sweden. Some travel to other sites may occur. We allow hybrid work practices; if agreed with the Lighthouse manager, it will be possible to work from home/off-site from time to time.

Requirements:
We are looking for a Full Stack Data Developer who can design, develop, and maintain data products and data assets. Previous experience in the data domain is essential. You will work as part of a Scrum team and interact with business users to understand the requirements.

Experience:
• At least 3 years' experience working with data product solutions and concepts
• Proficiency in data modelling techniques (dimensional modelling, normalisation)
• Proficiency in database systems, data warehousing, and distributed computing concepts
• Proficiency in programming languages, e.g. Python and SQL, and data processing frameworks/libraries (Apache Spark, pandas)
• Proficiency with data pipeline orchestration tools (Azure Data Factory, Databricks, Azure Logic Apps, Apache Airflow, dbt)
• Proficiency working with code repositories (Git) and CI/CD pipelines
• Proficiency in integrations serving and consuming REST APIs and Kafka
• Proficiency in containerization (Docker, Kubernetes/OpenShift, Azure Functions)
• Experience deploying infrastructure resources with YAML and Terraform
• Familiarity with cloud platforms (one of Azure, GCP, AWS)

Merits:
• Preferably a background in software engineering, with an interest in data & analytics
• Experience in requirement processes for UI or UX
• Experience in agile ways of working (Scrum)

Language:
English is required. Fluency in Swedish is a big plus, and a basic understanding of Danish and/or Norwegian is helpful.

To apply: CV in Word with the requested experience.
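To give a concrete flavour of the ingest → transform → quality-check responsibilities listed above, here is a minimal sketch of one such pipeline step. It uses plain Python as a stand-in for the Spark/Databricks stack named in the listing, and every record and field name (delivery IDs, stop sequences, delay minutes) is hypothetical.

```python
# Minimal sketch of one ingest -> transform -> quality-check pipeline step.
# Plain-Python stand-in for the Azure/Databricks stack; all field names
# are hypothetical, chosen to suggest last-mile delivery data.
import csv
import io

RAW_DELIVERIES = """\
delivery_id,stop_sequence,planned_minutes,actual_minutes
D1,1,30,35
D1,2,20,18
D2,1,25,
"""

def ingest(text: str) -> list[dict]:
    """Read raw delivery records from a CSV source."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows: list[dict]) -> list[dict]:
    """Derive per-stop delay, a typical route-optimization input."""
    out = []
    for row in rows:
        row = dict(row)
        if row["actual_minutes"]:
            row["delay_minutes"] = (
                int(row["actual_minutes"]) - int(row["planned_minutes"])
            )
        else:
            row["delay_minutes"] = None  # flagged by the quality check below
        out.append(row)
    return out

def quality_check(rows: list[dict]) -> list[str]:
    """Return data-quality issues; an empty list means the batch is clean."""
    return [
        f"{row['delivery_id']} stop {row['stop_sequence']}: missing actual_minutes"
        for row in rows
        if row["delay_minutes"] is None
    ]

batch = transform(ingest(RAW_DELIVERIES))
print(quality_check(batch))  # ['D2 stop 1: missing actual_minutes']
```

In a Databricks setting the same shape appears with Spark DataFrames and scheduled jobs instead of Python lists, but the separation of ingestion, transformation, and quality checks is the same.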

Required skills

Fullstack
Python
SQL
Apache Spark
pandas
Azure
Databricks
Apache Airflow
dbt
Git
CI/CD
REST
Kafka
Docker
Kubernetes
OpenShift
YAML
Terraform
AWS
GCP
UI
UX
Scrum

Languages

English
Swedish
