Kairos Blue AB jobs in Lund

Find open jobs at Kairos Blue AB in Lund. Choose a job to read more about it, or go ahead and apply for the job in Lund.

GCP Data Engineer

Systems Developer/Programmer
Read more · Jan 23
We are looking for a Data Engineering Specialist for the following responsibilities:
Design, develop and operate production-grade data pipelines for data ingestion and processing
Build analytics products, considering both technology and business requirements
Work closely with Product Owner and business stakeholders to ensure business value realization as part of a cross-functional agile team with Product Owner, Data Scientists/Analysts/Stewards and Data/ML/Software/DevOps Engineers.

Key responsibilities:
Extract data from external source systems
Perform data profiling to validate data quality
ETL development and tools
Prepare data and access management for advanced analytics and self-service
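The data-profiling responsibility above can be sketched in plain Python. This is a minimal sketch assuming flat, dict-shaped records and a made-up 10% null-rate threshold (neither is specified in the listing):

```python
# Minimal data-profiling sketch: compute null rates per field and flag
# fields that exceed an (assumed) quality threshold.
from collections import Counter

def null_rates(rows):
    """Return the fraction of missing (None/empty) values per field."""
    missing = Counter()
    fields = {key for row in rows for key in row}
    for row in rows:
        for field, value in row.items():
            if value in (None, ""):
                missing[field] += 1
    return {field: missing[field] / len(rows) for field in fields}

def validate(rows, max_null_rate=0.1):
    """Return the fields whose null rate exceeds the threshold."""
    rates = null_rates(rows)
    return sorted(field for field, rate in rates.items() if rate > max_null_rate)

rows = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": None},
    {"id": 3, "email": ""},
]
print(validate(rows))  # → ['email']
```

In a production pipeline a check like this would typically run against warehouse tables rather than in-memory rows, with thresholds configured per dataset.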
Additionally:
Agile Methodologies
Self-driven, action- and goal-oriented, with good communication skills toward both technical and non-technical stakeholders



Tools and techniques:
Google Cloud Platform.
Data Engineering: developing data pipelines/ETL for data lakes, data warehouses and data marts.
Complex queries for processing and analysis using BigQuery
Programming: Python and preferably also JavaScript; other languages are a merit.
DevOps, DataOps, CI/CD, Infrastructure as code/config

Apply now

Sr Data Engineer

Systems Developer/Programmer
Read more · Oct 19
We are looking for a Data Engineering Specialist for the following responsibilities:
Design, develop and operate production-grade data pipelines for data ingestion and processing
Build analytics products, considering both technology and business requirements
Work closely with Product Owner and business stakeholders to ensure business value realization as part of a cross-functional agile team with Product Owner, Data Scientists/Analysts/Stewards and Data/ML/Software/DevOps Engineers.

Key responsibilities:
Extract data from external source systems
Perform data profiling to validate data quality
ETL development and tools
Prepare data and access management for advanced analytics and self-service
Additionally:
Agile Methodologies
Self-driven, action- and goal-oriented, with good communication skills toward both technical and non-technical stakeholders



Tools and techniques:
Google Cloud Platform / Azure
Data Engineering: developing data pipelines/ETL for data lakes, data warehouses and data marts.
Complex queries for processing and analysis using BigQuery
Programming: Python and preferably also JavaScript; other languages are a merit.
DevOps, DataOps, CI/CD, Infrastructure as code/config
Merits
Azure Database (SSMS) as data source
Azure portal (assigning roles and resources for Data scientists or other consumers, setting up Logic Apps, creating/managing data lakes)
Power Platform (Power BI, PowerApps, Power Automate)

Apply now

Data Engineer

Data Engineer
Read more · Jan 19
We are looking for a GCP Data Engineering Specialist for the following responsibilities:
· Design, develop and operate production-grade data pipelines for data ingestion and processing, enabling downstream pipelines and analytics products, considering both technology and business requirements
· Design data compliance and access management for personally identifiable sensitive datasets in BigQuery
· Work with and contribute to a DevOps setup (continuous integration and deployment) on GCP
· Set up monitoring and alerting for data ops, quality and availability
· Work closely with Product Owner and business stakeholders to ensure business value realization as part of a cross-functional agile team with Product Owner, Data Scientists/Analysts/Stewards and Data/ML/Software/DevOps Engineers.
Key responsibilities:
Extract data from external source systems
Perform data profiling to validate data quality
ETL development using Google's BigQuery and Matillion
Prepare data and access management for advanced analytics and self-service
Create ingestion patterns that allow integration with external data sources
Mapping tables for additional dimensions
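The mapping-table responsibility above can be illustrated with a small lookup join in plain Python. The country-to-region mapping, the field names, and the UNKNOWN default member are made-up assumptions for illustration, not details from the listing:

```python
# Illustrative dimension lookup: enrich fact rows with an extra dimension
# via a mapping table, defaulting unmapped values to an "unknown" member.

country_to_region = {  # hypothetical mapping table
    "SE": "EMEA",
    "DE": "EMEA",
    "US": "AMER",
}

def add_region(fact_rows, mapping, default="UNKNOWN"):
    """Return new fact rows with a 'region' dimension attached."""
    return [
        {**row, "region": mapping.get(row.get("country"), default)}
        for row in fact_rows
    ]

facts = [
    {"order_id": 1, "country": "SE"},
    {"order_id": 2, "country": "BR"},  # not in the mapping table
]
enriched = add_region(facts, country_to_region)
# order 1 → EMEA, order 2 → UNKNOWN
```

In a warehouse setting the same join would normally be expressed in SQL against a dimension table; the explicit default member keeps unmapped facts queryable instead of silently dropping them.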

Additionally:
Flexible, with an agile mindset; be prepared to handle ad hoc requests from stakeholders, since some things are implemented on the fly. Familiar with Agile and Scrum.
Self-driven, action- and goal-oriented, with good communication skills toward both technical and non-technical stakeholders
Tools and techniques:
Google Cloud Platform.
Data Engineering: developing data pipelines/ETL for data lakes, data warehouses and data marts.
Complex queries for processing and analysis using BigQuery
Programming: Python and preferably also JavaScript; other languages are a merit.
DevOps, DataOps, CI/CD, Infrastructure as code/config
Apache Spark / Beam / Dataflow
Apache Airflow
Merits
Azure Database (SSMS) as data source
Azure portal (assigning roles and resources for Data scientists or other consumers, setting up Logic Apps, creating/managing data lakes)
Power Platform (Power BI, PowerApps, Power Automate)
Which 3 things from the list above are most important?
· GCP Data Engineering/Warehousing
·
· Programming

Apply now

IT Cloud Architect

IT Architect/Solutions Architect
Read more · Dec 13
We are looking for a GCP Data Engineering Specialist for the following responsibilities:
· Design, develop and operate production-grade data pipelines for data ingestion and processing, enabling downstream pipelines and analytics products, considering both technology and business requirements
· Design data compliance and access management for personally identifiable sensitive datasets in BigQuery
· Work with and contribute to a DevOps setup (continuous integration and deployment) on GCP
· Set up monitoring and alerting for data ops, quality and availability
· Work closely with Product Owner and business stakeholders to ensure business value realization as part of a cross-functional agile team with Product Owner, Data Scientists/Analysts/Stewards and Data/ML/Software/DevOps Engineers.

Key responsibilities:
Technical architect
Extract data from external source systems
Perform data profiling to validate data quality
ETL development using Google's BigQuery and supporting tools
Prepare data and access management for advanced analytics and self-service
Create ingestion patterns that allow integration with external data sources
Mapping tables for additional dimensions

Additionally:
Flexible, with an agile mindset; be prepared to handle ad hoc requests from stakeholders, since some things are implemented on the fly. Familiar with Agile and Scrum.
Self-driven, action- and goal-oriented, with good communication skills toward both technical and non-technical stakeholders

Tools and techniques:
Google Cloud Platform.
Data Engineering: developing data pipelines/ETL for data lakes, data warehouses and data marts.
Complex queries for processing and analysis using BigQuery
Programming: Python and preferably also JavaScript; other languages are a merit.
DevOps, DataOps, CI/CD, Infrastructure as code/config
Apache Spark / Beam / Dataflow
Apache Airflow

Merits
Azure Database (SSMS) as data source
Azure portal (assigning roles and resources for Data scientists or other consumers, setting up Logic Apps, creating/managing data lakes)
Power Platform (Power BI, PowerApps, Power Automate)

Which 3 things from the list above are most important?
· GCP Data Engineering/Warehousing
·
· Programming

Apply now

Sr. Embedded SW Engineer

Systems Developer/Programmer
Read more · Feb 5
Experienced in constrained embedded environments / IoT with the latest mobile IoT device technology. The role covers sensing and communication technologies in resource-constrained embedded device environments, and requires extensive experience in embedded systems and RTOS.


Technical Experience & Skills
Required
Embedded SW with bare metal development on ARM HW
Real-time OS: NuttX or FreeRTOS
Programming language: C, C++ and Python (or other scripting languages)
Linux/Ubuntu experience
Git
Machine Learning workflow experience


Preferred
Cellular networks 3GPP/LTE/5G
Short range networks BLE, 802.15.4
Security protocols: TLS, DTLS, OSCORE
Build systems experience
Systems design experience
Gitlab/Github
HW experience – basic schematics reading, soldering, wiring

Apply now

Cloud System Architect

Systems Architect
Read more · Sep 25
#jobbjustnu
For our project with a leading customer, we need a system administrator and DevOps architect to manage secure systems in a cloud environment.


You will:
Work with delivering virtualized, enterprise-grade solutions for cloud usage
Deploy container-based services using Docker, working with Docker images, Docker Hub, Docker registries and Kubernetes
Integrate microservices into the solution
Provide continuous integration and deployment functions for the cloud environment
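A container-based service of the kind described above might be deployed with a manifest along these lines. This is a minimal sketch: the service name, image, replica count and port are placeholder assumptions, not details from the listing:

```yaml
# Minimal Kubernetes Deployment sketch (placeholder names and values).
apiVersion: apps/v1
kind: Deployment
metadata:
  name: example-microservice        # hypothetical service name
spec:
  replicas: 2
  selector:
    matchLabels:
      app: example-microservice
  template:
    metadata:
      labels:
        app: example-microservice
    spec:
      containers:
        - name: example-microservice
          image: registry.example.com/example-microservice:1.0  # placeholder image
          ports:
            - containerPort: 8080
```

A manifest like this is applied with `kubectl apply -f`; in a CI/CD setup it would typically be templated and rolled out by the pipeline.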



To be successful in the role you must have
Expertise in the relevant technologies (Kubernetes, Docker, OpenStack private cloud)
Microservices architectures experience
Experience with cloud infrastructures and issue resolution
Hands-on experience in developing and deploying microservices solutions on Cloud infrastructure
Experience with infrastructure automation tools (Ansible, Chef, Puppet, or the like)
Automation capability with Python
Experience with build tools, preferably Jenkins, Artifactory, Gerrit, Git, Bazel, CMake
Excellent communication and stakeholder management skills
Relevant certifications in Kubernetes, Linux and OpenStack architecture

Apply now