Hands-on experience with open-source ETL and data pipeline orchestration tools such as Apache Airflow and NiFi. Experience with large-scale/big data technologies such as Hadoop, Spark, Hive, Impala, PrestoDB, Kafka. Experience with containerisation using Docker and deployment on Kubernetes. Experience with NoSQL and graph databases. Unix …
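The orchestration tools named above all share one underlying model: a pipeline is a directed acyclic graph (DAG) of tasks executed in dependency order. A minimal sketch of that idea in plain Python, using the standard library's `graphlib` (the task names `extract`/`transform`/`load`/`report` are invented for illustration, not Airflow API):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: each task maps to the set of tasks it depends on.
# Orchestrators such as Airflow model pipelines the same way, as a DAG.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "report": {"load"},
}

def run_pipeline(dag):
    """Execute tasks in topological (dependency) order."""
    order = list(TopologicalSorter(dag).static_order())
    for task in order:
        print(f"running {task}")
    return order

run_pipeline(dag)
# runs extract, then transform, then load, then report
```

Real orchestrators add scheduling, retries, and backfills on top of this core, but the dependency-ordering logic is the same.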
cloud platforms (e.g., AWS, Azure, GCP). Experience deploying ML models or managing AI/ML workflows in production. Working knowledge of big data technologies like Spark, Hive, or Hadoop. Familiarity with MLOps tools (e.g., MLflow, Kubeflow, DataRobot). Education: Bachelor's degree in Computer Science, Software Engineering, or a related technical field, or equivalent practical experience. Why …
tools like Apache NiFi, Talend, or custom scripts. Familiarity with ELT (Extract, Load, Transform) processes is a plus. Big Data Technologies: Familiarity with big data frameworks such as Apache Hadoop and Apache Spark, including experience with distributed computing and data processing. Cloud Platforms: Proficient in using cloud platforms (e.g., AWS, Google Cloud Platform, Microsoft Azure) for data storage, processing …
Microsoft Azure, or Google Cloud Platform (GCP). Strong proficiency in SQL and experience with relational databases such as MySQL, PostgreSQL, or Oracle. Experience with big data technologies such as Hadoop, Spark, or Hive. Familiarity with data warehousing and ETL tools such as Amazon Redshift, Google BigQuery, or Apache Airflow. Proficiency in Python and at least one other programming language …
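The SQL-and-relational-databases requirement above boils down to being comfortable with schemas, joins, and aggregation. A minimal sketch using the standard library's `sqlite3` driver (the `customers`/`orders` schema and data are invented for illustration):

```python
import sqlite3

# In-memory database with two related tables, illustrating a join + aggregate.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (
        id INTEGER PRIMARY KEY,
        customer_id INTEGER REFERENCES customers(id),
        total REAL
    );
    INSERT INTO customers VALUES (1, 'Ada'), (2, 'Grace');
    INSERT INTO orders VALUES (10, 1, 25.0), (11, 1, 40.0), (12, 2, 15.0);
""")

# Total spend per customer, highest first.
rows = conn.execute("""
    SELECT c.name, COUNT(o.id) AS n_orders, SUM(o.total) AS spend
    FROM customers c JOIN orders o ON o.customer_id = c.id
    GROUP BY c.name
    ORDER BY spend DESC
""").fetchall()
# rows -> [('Ada', 2, 65.0), ('Grace', 1, 15.0)]
```

The same query shape carries over to MySQL, PostgreSQL, or Oracle with only dialect-level changes.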
S3, Lambda, BigQuery, or Databricks. Solid understanding of ETL processes, data modeling, and data warehousing. Familiarity with SQL and relational databases. Knowledge of big data technologies, such as Spark, Hadoop, or Kafka, is a plus. Strong problem-solving skills and the ability to work in a collaborative team environment. Excellent verbal and written communication skills. Bachelor's degree in …
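"ETL processes" in listings like the one above means the extract → transform → load sequence: pull raw data from a source, clean and reshape it, and write it to a target store. A toy end-to-end sketch in plain Python (the CSV data, field names, and Celsius-to-Fahrenheit transform are invented for illustration; a real pipeline would read from S3 or a database and load into a warehouse):

```python
import csv
import io

# Raw source data; note the leading space and the missing value to clean up.
RAW = "city,temp_c\nLondon, 21\nParis,25\nOslo,\n"

def extract(text):
    """Parse the raw CSV into dict records."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Drop incomplete records and derive a Fahrenheit field."""
    out = []
    for r in rows:
        temp = r["temp_c"].strip()
        if not temp:  # skip records with missing values
            continue
        out.append({"city": r["city"], "temp_f": round(float(temp) * 9 / 5 + 32, 1)})
    return out

def load(rows, target):
    """Append transformed records to the target (standing in for a warehouse)."""
    target.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract(RAW)), warehouse)
# warehouse -> [{'city': 'London', 'temp_f': 69.8}, {'city': 'Paris', 'temp_f': 77.0}]
```

ELT, mentioned in other listings here, simply swaps the last two steps: load the raw records first, then transform them inside the target system.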
and statistical packages. Strong analytical, problem-solving, and critical thinking skills. 8. Experience with social media analytics and understanding of user behaviour. 9. Familiarity with big data technologies, such as Apache Hadoop, Apache Spark, or Apache Kafka. 10. Knowledge of AWS machine learning services, such as Amazon SageMaker and Amazon Comprehend. 11. Experience with data governance and security best practices in AWS. 12. Excellent …
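Hadoop and Spark, listed above, are both built around the map/shuffle/reduce pattern: emit key-value pairs from each record, group them by key, then aggregate each group. A word-count sketch of that pattern in plain Python (the two-document corpus is invented for illustration; a real cluster runs each phase in parallel across machines):

```python
from collections import defaultdict
from itertools import chain

docs = ["big data big ideas", "data pipelines move data"]

def map_phase(doc):
    """Emit a (word, 1) pair for every word in the document."""
    return [(word, 1) for word in doc.split()]

def shuffle(pairs):
    """Group emitted values by key, as the framework does between phases."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Aggregate each key's values into a final count."""
    return {key: sum(values) for key, values in groups.items()}

counts = reduce_phase(shuffle(chain.from_iterable(map_phase(d) for d in docs)))
# counts -> {'big': 2, 'data': 3, 'ideas': 1, 'pipelines': 1, 'move': 1}
```

Spark exposes the same idea through higher-level RDD/DataFrame operations rather than explicit map and reduce jobs.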
professional experience. Preferred Skills: Experience working within the public sector. Knowledge of cloud platforms (e.g., IBM Cloud, AWS, Azure). Familiarity with big data processing frameworks (e.g., Apache Spark, Hadoop). Understanding of data warehousing concepts and experience with tools like IBM Cognos or Tableau. Certifications: While not required, the following certifications would be highly beneficial: … ABOUT BUSINESS UNIT: IBM Consulting is IBM's consulting and global professional services business, with market-leading capabilities in business and …
time data pipelines for processing large-scale data. Experience with ETL processes for data ingestion and processing. Proficiency in Python and SQL. Experience with big data technologies like Apache Hadoop and Apache Spark. Familiarity with real-time data processing frameworks such as Apache Kafka or Flink. MLOps & Deployment: Experience deploying and maintaining large-scale ML inference pipelines into production …
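Stream processors such as Kafka Streams and Flink, mentioned above, commonly aggregate unbounded event streams by bucketing events into fixed-size time windows. A tumbling-window sketch in plain Python (the events and 10-second window size are invented for illustration; real frameworks also handle out-of-order events and distributed state):

```python
from collections import defaultdict

# Simulated event stream: (timestamp_seconds, value) pairs.
events = [(1, 5), (4, 3), (11, 7), (13, 1), (25, 2)]
WINDOW = 10  # tumbling window of 10 seconds

def windowed_sums(events, window):
    """Sum values per non-overlapping time window, keyed by window start."""
    sums = defaultdict(int)
    for ts, value in events:
        sums[(ts // window) * window] += value
    return dict(sums)

windowed_sums(events, WINDOW)
# -> {0: 8, 10: 8, 20: 2}
```

"Tumbling" means the windows do not overlap; sliding and session windows are variations on the same bucketing idea.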
environment. Preferred Qualifications: AWS Certified Data Analytics - Specialty or AWS Certified Solutions Architect - Associate. Experience with Airflow for workflow orchestration. Exposure to big data frameworks such as Apache Spark, Hadoop, or Presto. Hands-on experience with machine learning pipelines and AI/ML data engineering on AWS. Benefits: Competitive salary and performance-based bonus structure. Join a rapidly expanding …
5+ years of experience in data engineering roles with progressively increasing responsibility. Proven experience designing and implementing complex data pipelines at scale. Strong knowledge of distributed computing frameworks (Spark, Hadoop ecosystem). Experience with cloud-based data platforms (AWS, Azure, GCP). Proficiency in data orchestration tools (Airflow, Prefect, Dagster, or similar). Solid programming skills in Python, Scala, or Java. Experience …
years of experience. • Working experience with the Palantir Foundry platform is a must. • Experience designing and implementing data analytics solutions on enterprise data platforms and distributed computing (Spark/Hive/Hadoop preferred). • Proven track record of understanding and transforming customer requirements into a best-fit design and architecture. • Demonstrated experience in end-to-end data management, data modelling, and …
platforms (e.g., AWS, Azure, Google Cloud). Knowledge of machine learning techniques and frameworks. Experience with version control systems (e.g., Git). Familiarity with big data technologies (e.g., Snowflake, Hadoop, Spark).