Senior Data Engineer - Kuala Lumpur, Malaysia - The Edge Partnership

Description

Key responsibilities
- Implement ETL systems that are operationally stable, efficient, and automated.
- This includes technical solutions that are scalable, aligned with the enterprise architecture, and able to adapt to changing business needs.
- Monitor and optimize batch jobs, including automation and scheduling.
- Consistently review and enhance performance, with a key focus on ease of maintenance and job reusability.
- Work closely with internal/external teams to devise requirements for data integrations, specifically for Data Warehouse/Data Mart implementations.
- Provide data quality and volume metric measurement across the organization's data.
- Implement best practices for data change management, documentation, and data protocols.
- Grow the capabilities of the data platforms, solving new data problems and challenges.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Work with stakeholders, including the Business Analyst and Data Science teams, to assist with data-related technical issues and support their data infrastructure needs.
- Support and implement initiatives for continual engineering improvements.
- Review business and technical requirements and ensure the data integration platform meets requirements.
- Apply industry best practices for ETL design and development.
- Produce written deliverables for technical design, system testing and implementation activities.
- Conduct system testing: execute job flows, investigate and resolve system defects, and document results.
- Work with DBAs, application specialists and technical services to tune performance of the ETL system to meet performance standards and SLAs.
- Assist in the development, documentation and application of best practices and procedures.
Role requirements

- At least 3-4 years of experience developing data warehouse and business intelligence solutions.
- Hands-on experience with Microsoft SSIS and MS-SQL, specifically in designing and implementing ETL packages.
- Strong practical experience in data warehousing, data modelling/architecture and SQL.
- Knowledge of cloud data engineering tools such as Python, PySpark, Airflow, BigQuery, etc.
- Knowledge of database administration methodologies and techniques.
- Experience with performance tuning & query optimization of data warehouse systems.
- Ability to work in a fast-paced agile development environment.
- Ability to troubleshoot and solve complex technical problems.
- Strong analytical, critical thinking and problem-solving skills when navigating uncertainties and/or complex situations.
- Able to work independently with a proactive attitude.
Skills & Competencies

- Strong SQL writing skills.
- Familiarity with ETL tools such as SSIS, Apache Airflow and Oracle Data Integrator, and with GitHub.
- Understanding of data integration best practices and leading industry applications and features, such as master data management, entity resolution, data quality assessment, metadata management, etc.
- Exposure to data warehouse architecture and data analysis/data profiling of source systems.
- Able to function well in a fast-paced and adaptive environment.
- Ability to work independently and to communicate across multiple levels (product owners, commercial stakeholders, team members).
- Work experience with Agile methodology is a plus.
- A strong team player with good communication skills.
- Good interpersonal skills and the ability to collaborate with other technical teams, project management, client management and business analysts.