Data Engineer

04/03/2025
04/04/2025
Permanent - Full Time
Sydney office
Shared Services

We are seeking a motivated Data Engineer to join our dynamic data team. This mid-level role suits a candidate with a solid foundation in data engineering, or a data analyst with a strong background who is looking to transition into the field. You will be responsible for building reports, working with semantic models (e.g. star schemas), and developing data pipelines to support our analytics and business intelligence efforts. Whether you are a data analyst expanding your skill set or an experienced data engineer, this role offers an excellent opportunity to make a big impact in a fast-paced environment. We’re a small team and we want someone with a “jump into anything” mentality.

Job Description

Key Accountabilities

The Data Engineer – Reporting and Analytics role is a key position within the IT department, working within the data team to build reports, work with semantic models, and develop pipelines that support NobleOak’s data analytics and business intelligence efforts. The role provides integral analytic support to a small team and contributes to company-wide innovation and automation initiatives: identifying potential risks, implementing analytic strategies, coordinating projects with members of the Data team, and reporting status and outcomes to the Senior Data Engineer and Data Lead.

  • Data Integration – Design and implement robust, scalable, and efficient data pipelines to collect, transform, and distribute data that supports NobleOak’s growing business intelligence systems and services. Integrate data from various sources, ensuring data consistency, reliability, and quality (a sketch of this kind of pipeline follows this list).
  • Governance and Reporting – Ensure data integrity, governance, quality, and security within the data warehouse and data lake solutions. Implement data validation and quality checks, and ensure compliance with data governance policies and standards.
  • Optimisation and Automation – Optimise data systems for performance, efficiency, and output. Address performance bottlenecks and implement improvements to enhance data processing efficiency. Design and implement automation processes that improve data generation, processing, and production.
  • Stakeholder Communication – Work closely with other data team members, leaders, analysts, and other stakeholders to understand data needs and requirements. Provide data support for various business functions and communicate risks and outcomes as they arise.
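
To make the Data Integration accountability concrete, here is a minimal PySpark sketch of a collect–transform–distribute pipeline with a simple quality check. All paths and column names (/landing/policies/, policy_id, premium) are invented for illustration and do not describe NobleOak’s actual systems; the Delta write assumes a Databricks- or Fabric-style environment where Delta is available.

    # Minimal sketch only: paths, columns, and the Delta write are
    # illustrative assumptions, not NobleOak's actual systems.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("policy_ingest_sketch").getOrCreate()

    # Collect: read a raw extract from a (hypothetical) landing zone.
    raw = spark.read.option("header", True).csv("/landing/policies/")

    # Transform: enforce types, standardise dates, de-duplicate on the key.
    clean = (
        raw.withColumn("premium", F.col("premium").cast("double"))
           .withColumn("start_date", F.to_date("start_date", "yyyy-MM-dd"))
           .dropDuplicates(["policy_id"])
    )

    # Quality check: fail fast rather than distribute bad data downstream.
    if clean.filter(F.col("policy_id").isNull()).count() > 0:
        raise ValueError("null policy_id in extract; aborting load")

    # Distribute: publish to a curated zone for reporting to build on.
    clean.write.mode("overwrite").format("delta").save("/curated/policies/")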

Key Responsibilities

  • Reporting and Analytics – Develop, maintain, and optimise Power BI reports and dashboards to provide actionable insights to various business units. Work with semantic models to ensure data is accurately represented and easily accessible for reporting and analysis. Work with data scientists to deploy and maintain machine learning algorithms, metrics, and other analytics.
  • Process Automation – Streamline and automate business operations, including regulatory processes and reporting. Assist in developing ETL processes to ingest data from source systems such as SQL Server and Salesforce into Azure Data Services (a sketch of such an ingest follows this list).
  • Stakeholder Collaboration – Collaborate with senior data engineers to create and maintain data pipelines and data processing workflows using Azure Fabric/Databricks. Work jointly with data analysts and scientists to ensure the distribution of accurate and reliable predictions and analytics. Communicate effectively with various business units to understand their data needs and deliver appropriate solutions. Understand DevOps processes and contribute to them in a meaningful way.
  • Team Support – Support the data team in the implementation of an end-to-end analytics and data platform (Databricks/Microsoft Fabric). Provide support to divisional analysts in automating the distribution of insights and predictions. Assist in upskilling more junior or less experienced data practitioners in the technologies and techniques you have experience with.
  • Documentation – Maintain documentation for data pipelines, models, and processes to ensure clarity and maintainability.
  • Culture Champion – Like all NobleOak roles, this role must be a strong ambassador and champion of the NobleOak high-performance culture and brand values: always displaying positivity, energy, and integrity, taking responsibility for actions, and building a positive, engaging, and diverse team dynamic. Our values and culture are outlined on the back page of this document.
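
As an illustration of the ETL work described under Process Automation, the sketch below pulls a table from SQL Server over JDBC and lands it as a lakehouse table. Host, database, table, and credential values are placeholders, not real systems; in practice the JDBC driver must be on the classpath and secrets would come from a secret scope or Key Vault rather than literals.

    # Illustrative JDBC ingest: host, database, table, and credentials
    # are placeholders; real secrets belong in a secret scope/Key Vault.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("sqlserver_ingest_sketch").getOrCreate()

    claims = (
        spark.read.format("jdbc")
        .option("url", "jdbc:sqlserver://example-host:1433;databaseName=Policies")
        .option("dbtable", "dbo.Claims")
        .option("user", "etl_reader")
        .option("password", "***")  # placeholder only
        .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
        .load()
    )

    # Land the extract as a lakehouse table for downstream modelling.
    claims.write.mode("append").saveAsTable("bronze.claims_raw")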

Desired Skills and Experience

Capabilities

  • Hands-on experience in developing and maintaining Power BI reports and dashboards
  • Strong understanding of data modelling concepts and experience working with semantic models such as star schemas (a toy example follows this list)
  • Experience in SQL and T-SQL for data querying and manipulation
  • Experience in Python and PySpark for analytics and data wrangling
  • Basic knowledge of ETL processes and data pipeline development
  • Familiarity with Azure Data Services (e.g., Azure Data Factory, Azure Synapse Analytics, Azure Data Lake Storage)
  • Basic programming skills in languages beyond SQL, such as Python, PySpark, Power Query M, or Spark (Scala)
  • Experience with data governance frameworks and relevant software (MS Purview) is a plus
  • Understanding of CI/CD tools and practices (Azure DevOps) is a plus
  • Ability to communicate effectively with various business units and understand their data needs
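
As a toy illustration of the star-schema modelling mentioned above, the PySpark snippet below joins a hypothetical fact table to two dimensions and rolls up a measure, the shape of query a semantic model serves to Power BI reports. All table and column names (gold.fact_premiums, dim_product, premium_amount, etc.) are invented for this sketch.

    # Toy star schema: one fact table joined to two dimensions, then
    # aggregated. All table and column names are invented.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("star_schema_sketch").getOrCreate()

    fact_premiums = spark.table("gold.fact_premiums")  # one row per payment
    dim_product = spark.table("gold.dim_product")
    dim_date = spark.table("gold.dim_date")

    # The kind of roll-up a semantic model exposes to report builders.
    monthly_by_product = (
        fact_premiums
        .join(dim_product, "product_key")
        .join(dim_date, "date_key")
        .groupBy("product_name", "calendar_month")
        .sum("premium_amount")
    )
    monthly_by_product.show()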

Experience & Qualifications

  • Bachelor’s degree in Computer Science, Data Science, Information Technology, or a related field
  • Certifications in Power BI, Azure, or related technologies
  • Experience in the financial services or technology sectors
  • Familiarity with Databricks/MS Fabric and data engineering concepts