Data Engineer (Full-Time)

Webureon | HQ: Singapore, Singapore | Remote | Posted Sep 30

Position Overview

We are seeking a highly skilled and motivated Data Engineer to design, build, and maintain scalable data pipelines and infrastructure. The ideal candidate will have strong technical expertise, excellent problem-solving skills, and native or near-native English communication abilities to effectively collaborate with cross-functional teams across global environments.

Key Responsibilities

  • Design, develop, and maintain reliable ETL/ELT pipelines to ingest, process, and transform structured and unstructured data.
  • Build and manage scalable data infrastructure on cloud platforms (e.g., AWS, Azure, GCP) or on-premises systems.
  • Optimize database and data warehouse performance to ensure efficient data storage and retrieval.
  • Collaborate with data scientists, analysts, and stakeholders to translate business requirements into scalable data solutions.
  • Implement data quality, validation, and governance practices to ensure consistency and reliability.
  • Work with large-scale data technologies (e.g., Hadoop, Spark, Kafka, Snowflake, BigQuery, Redshift).
  • Automate workflows and optimize processes for faster data availability.
  • Stay updated with emerging tools, technologies, and best practices in data engineering.

Required Qualifications

  • Bachelor’s degree in Computer Science, Information Systems, Engineering, or related field (or equivalent work experience).
  • Proficiency in SQL and experience with relational and NoSQL databases.
  • Strong programming skills in Python, Java, or Scala.
  • Hands-on experience with data processing frameworks (e.g., Spark, Flink, Beam).
  • Experience with workflow orchestration tools (e.g., Airflow, Prefect, Luigi).
  • Familiarity with cloud data platforms (AWS Glue, GCP Dataflow, Azure Data Factory, etc.).
  • Knowledge of data modeling, data warehousing, and schema design.
  • Strong problem-solving and collaboration skills.
  • Native or near-native English fluency.

Preferred Qualifications

  • Experience with real-time data streaming (Kafka, Kinesis, Pub/Sub).
  • Exposure to machine learning pipelines and MLOps practices.
  • Knowledge of DevOps tools for CI/CD in data pipelines.
  • Familiarity with data governance, compliance, and security frameworks (GDPR, HIPAA, etc.).

What We Offer

  • Competitive salary and benefits package.
  • Opportunities for professional growth and skill development.
  • A collaborative environment working on cutting-edge data solutions.

Requirements

  • Availability: Full-time (40 hrs/wk)
  • Experience levels: Beginner (1-3 yrs), Intermediate (3-5 yrs), Expert (5+ yrs)
  • Languages: English
  • Rate: $60/hr