Title:  Data Engineer Manager

Job ID:  80241
Country:  Philippines
City:  Taguig City
Professional area:  Information Technology
Contract type:  Permanent
Professional level:  Experienced
Location: Taguig City, 00, PH, 1634

We’re JTI, Japan Tobacco International, and we believe in freedom

 
We think that the possibilities are limitless when you’re free to choose. We’ve spent the last 20 years innovating and creating new and better products for our consumers to choose from. It’s how we’ve grown to be present in 130 countries, and how we’ve grown from 40 to 4,000+ employees in the Philippines since 2009.

 

But our business isn’t just business; our business is our people. Their talent. Their potential. We believe that when they’re free to be themselves, to grow, travel and develop, amazing things can happen for our business. That’s why our employees, from around the world, choose to be a part of JTI. It’s why 9 out of 10 would recommend us to a friend, and why we’ve been recognized as INVESTORS IN PEOPLE in the Philippines.

 

It’s the perfect moment for you to #JoinTheIdea. We’re opening our Global Business Service center in the heart of BGC Manila and looking for more than 300 bright minds to join a global multinational with an exciting start-up vibe.

 

 

Department: Information Technology
Location: Taguig, Philippines
Reporting to: Data Platform and Operations Director

 

POSITION PURPOSE:

The Data Engineer Manager's role is to develop and maintain efficient and scalable data projects on JTI’s Central Data Lake Platform. The incumbent will provide efficient, high-quality development support for globally relevant applications and projects under the Data Platform and Operations team's governance. He/she will analyze, organize, and combine raw data from different sources and build the corresponding data systems and pipelines to ensure that business needs and objectives are met by the data delivered. He/she will explore new ways to enhance data quality and reliability, with the aim of providing the most valuable data assets for Data Analysts and Data Scientists to consume. The incumbent will ensure that reporting consultants assess and implement global changes and projects in a timely manner, and will coordinate solution delivery by reviewing and monitoring development efforts, project plans, and assignments.

  • Implement modern data solutions with Azure Data Factory, Azure Data Lake, Databricks, Azure SQL, and Synapse.
  • Analyze the impact of new and evolving business requirements and assess the feasibility of accommodating them.
  • Work with Data & Analytics, IT-BTS, and the Functions, together with external development teams, to ensure that configuration and customization meet business requirements and the performance baseline.
  • Follow up with external developers on problems and incidents related to Central Data Lake Platform processes and failures, ensuring that root causes are identified and fixed.
  • Perform UAT and performance and regression testing of new and existing developments to ensure accurate and efficient operation, using best practices.
  • Support production systems, including issue resolution.
  • Develop enhancements and changes based on functional or technical specifications.
  • Monitor system and project pipeline performance.


What will you do?

Data analysis and data quality support for applications, projects, and enhancements, which includes:

  • Ensure alignment and adherence to governance policy, global definitions, mappings, and business validation rules.
  • Actively manage and maintain the Commercial SKU attributes and values in SAP Material Master, making sure that all active SKUs are linked to the correct Commercial SKU in SAP (validating automated mapping from PLM and manually mapping unmapped SKUs).
  • Provide feasibility assessments on changes to the Commercial SKU definition, or take empowered decisions when a request is not feasible based on the available data.
  • Provide oversight and help facilitate the requirements-gathering process for Commercial SKU creation and assignment based on SKU data.
  • Provide data quality requirements for SKU and Commercial SKU relevant attributes/characteristics (rules, KPIs, etc.).
  • Deliver data quality improvements.
  • Prepare data for SKU and Commercial SKU data quality issues and follow up on data quality cleansing activities.

 

Development and maintenance of the data applications, projects, and enhancements under the Data Platform and Operations team's governance, which includes:

  • Design and develop the data pipelines for the corresponding projects, ensuring that they are highly available, scalable, reliable, secure, and cost-effective.
  • Collaborate with the GTC, EA, and IT Security teams to obtain approval for the chosen topology, and ensure that new developments fulfil security and technical requirements.
  • Design and document the application topology, technology recommendations, and delivery methodology.
  • Work with the project manager and external service providers to estimate resources, budget, and plan, highlighting known risks.
  • Write unit/integration tests, contribute to the engineering wiki, and document work.
  • Deliver developments up to the UAT phase and coordinate bug fixing during UAT.
  • Support roll-outs and deployments to production, ensuring the cutover is defined and followed up.
  • Ensure data pipeline maintenance and performance testing.
  • Take responsibility for the CI/CD of developments.

 

Central Data Lake Platform optimization and support:

  • Provide, review, and confirm technical solutions in alignment with the Data Platform Delivery Team Manager.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Ensure quick adoption of new features delivered by Microsoft as part of new data platform services.
  • Create and maintain an optimal data pipeline architecture, ensuring production data is always accurate and available for the key stakeholders and business processes that depend on it.
  • Monitor and maintain developed data pipelines proactively to ensure high service availability.
  • Investigate and resolve incidents, problems, and change requests via ServiceNow.
  • Coordinate outsourced resources for the required resolution.
  • Ensure that technical specification documentation is regularly maintained and updated.

 

Manage internal and external development team activities for the projects involved, including but not limited to the following tasks:

  • Code/peer review.
  • Communicate the requirements to the relevant resources.
  • Verify technical analyses and estimations given by resources.
  • Plan development activities and distribute workload among teams to maximize performance.
  • Coordinate efficient utilization of internal and external resources.
  • Onboard new internal/external consultants for new projects.
  • Implement proper unit testing activities and migration between environments.
  • Execute the lessons-learnt analysis together with the team at the end of each project.
  • Where applicable, ensure that external consultants always submit timesheets on time; validate and approve the timesheets.

 

Who are we looking for?

  • College or University Degree
  • Work Experience:
    • 10+ years of experience in Data and Analytics, from a modelling and reporting standpoint
    • 5+ years of exposure to IT architecture and solutioning for data flows among systems and applications
    • 3+ years delivering solutions on the Microsoft Azure platform, with special emphasis on data solutions and services
    • DevOps, CI/CD, and data project management experience
  • Additional Skills Required:
    • Strong analytical and troubleshooting skills
    • Excellent team leadership and communication skills
    • Good knowledge of testing and validating analytical solutions
    • Experience with Azure Data Lake, plus Databricks and Data Factory
    • Very good understanding of Azure PaaS management and resource monitoring
    • CI/CD and DevOps experience
    • Very good database skills (Azure SQL and Synapse), scripting, and data modelling
    • Very good written and spoken English

 


