Data Engineer - Singapore

Mid / Senior | Remote


As a Big Data Engineer, you will be responsible for the following:

  • Develop big data solutions for near real-time stream processing as well as batch processing on the Big Data platform.
  • Analyse problems and engineer highly flexible solutions.
  • Set up and run Big Data development frameworks such as Hive, Sqoop, Pig, Mahout, Scala, Spark, streaming mechanisms, and others.
  • Work with Big Data services on the cloud, preferably Azure (ADF, ADLS, Blob Storage, Azure SQL Data Warehouse, etc.).
  • Work with business domain experts, data scientists, and application developers to identify the data relevant for analysis and develop the Big Data solution.
  • Coordinate effectively with project team members, the customer, and business partners.
  • Adapt to and learn new technologies in the Big Data ecosystem.
  • Take the initiative to run the project and gel with the start-up environment.

Required Experience, Skills & Competencies

  • Minimum of 5 years of professional experience, including 2 years of Hadoop project experience.
  • Experience with Big Data technologies such as HDFS, Hadoop, Hive, Pig, Sqoop, Flume, Spark, etc.
  • Experience working in a cloud environment, preferably Azure.
  • Must have Core Java or Advanced Java experience.
  • Experience developing and managing scalable Hadoop cluster environments and other scalable, supportable infrastructure.
  • Familiarity with data warehousing concepts, distributed systems, data pipelines, and ETL.
  • Good communication (written and oral) and interpersonal skills.
  • Extremely analytical, with a strong business sense.
  • Experience with NoSQL technologies such as HBase, Cassandra, or MongoDB (good to have).

