Oakton
The Rocks
Oakton is an Australian consulting and technology firm founded in 1988. We bring together industry experience, business insight and specialist technology solutions to help our clients make the right decisions.


Current opportunities with Oakton

Data Engineer
Created 25 May 2018
Melbourne | per hour
Top skills desired
NoSQL
Cloudera
Power BI
Tableau
Cloud
Data Warehouse
Spark
Scala
Big Data
Data Ingestion
Data pipelines
Hadoop
Project description

We are looking for a Big Data Engineer to play a pivotal role in designing and implementing big data platforms across a range of clients. The successful candidate must perform well within a project team and work closely with clients to ensure client satisfaction and achievable delivery.

Responsibilities

  • Design and implement data pipelines to ingest, store and process big data
  • Provide direction for big data reference architectures and big data infrastructure
  • Work closely with internal and external project teams to support project and operational activities
  • Facilitate solution design workshops
  • Produce solution documentation that clearly communicates requirements, configuration changes and solution specifications

Requirements

  • Experience implementing large scale data processing systems
  • Experience implementing large-scale data loading, manipulation and processing using a range of big data technologies including NoSQL, Hadoop, Hive, Spark and Kafka
  • Extensive experience in data profiling, source-target mappings, ELT development, SQL optimisation, testing and implementation
  • Strong programming skills in at least one of Python, Scala, or SQL
  • Strong understanding of core Hadoop concepts including YARN, MapReduce, Hive, Pig, Sqoop and HDFS
  • Experience with visualisation tools such as Qlik, Tableau or Power BI
  • Experience translating business requirements into data-set requirements
  • Working knowledge of dimensional modelling and modern data warehouse architecture
  • Strong experience building and optimizing data pipelines
  • Strong communication and presentation skills
  • Excellent analytical and problem-solving skills, with the ability to think outside the box
  • Able to handle a fast-paced environment and juggle multiple tasks and deliverables
  • A team player who aids in growth and knowledge sharing
  • Experience with the Agile Scrum framework (highly desirable)
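As a rough illustration of the ingest, transform and aggregate flow the role describes, here is a minimal sketch using only the Python standard library. The source data, field names and schema are entirely hypothetical, and a real pipeline would read from HDFS, Kafka or cloud storage using tools like Spark rather than in-memory CSV:

```python
import csv
import io
from collections import defaultdict

# Hypothetical raw source data; in practice this would arrive from files
# on HDFS, a Kafka topic, or a cloud object store.
RAW_CSV = """event_date,product,amount
2018-05-01,widget,10.50
2018-05-01,gadget,7.25
2018-05-02,widget,3.00
"""

def ingest(raw):
    """Parse raw CSV into dict records (the extract/load step)."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(records):
    """Cast types and drop malformed rows."""
    clean = []
    for row in records:
        try:
            row["amount"] = float(row["amount"])
            clean.append(row)
        except (ValueError, KeyError):
            continue  # a real pipeline would route these to a quarantine table
    return clean

def aggregate(records):
    """Total amount per product -- a toy stand-in for a warehouse fact table."""
    totals = defaultdict(float)
    for row in records:
        totals[row["product"]] += row["amount"]
    return dict(totals)

totals = aggregate(transform(ingest(RAW_CSV)))
print(totals)  # {'widget': 13.5, 'gadget': 7.25}
```

The three stages map loosely onto the ELT and dimensional-modelling work listed above: parse and land the raw data, clean and type it, then roll it up for reporting.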