Title: Data Platform Engineer Manager
Bucharest, B, RO, 00000
At JTI we celebrate differences, and everyone truly belongs. 46,000 people from all over the world are continuously building their unique success story with us. 83% of employees feel happy working at JTI.
To make a difference with us, all you need to do is bring your human best.
What will your story be? Apply now!
Learn more: jti.com
Role: Permanent
Location: Bucharest
Data Platform Engineer Manager
About the position:
The Data Platform Engineer Manager provides guidance on the design and management of data for data applications, formulates best practices and development standards, and organizes processes for data management, governance, and evolution. He/She builds processes and tools to maintain high data availability, quality, and maintainability, and will develop and maintain the architectural roadmap for data products and data services while ensuring alignment with the business and Enterprise Architecture strategies and standards.
The incumbent will determine technical solutions that further business goals and align with corporate technology strategies, keeping in mind performance, reliability, scalability, usability, security, flexibility, and cost.
He/She will explore new ways to implement and automate data quality and reliability processes, with the aim of reducing development times and providing optimal, cost-effective data assets for Data Analysts and Data Scientists to consume.
What will you do - Responsibilities:
Re-engineer the Central Data Lake Platform towards a new Data Mesh paradigm under the Data Platform and Operations team governance, which includes:
• Collaborate intensively in the implementation of the Data Mesh concept by provisioning the Azure resources needed by the data domains or the data federation modules
• Design and develop artifacts that will reduce the deployment time of new data domains inside the Data Mesh concept
• Participate in new data initiatives (Purview, Trident) in order to facilitate their implementation
• Design the data pipelines for the corresponding projects, ensuring that they are highly available, scalable, reliable, secure, and cost-effective
• Collaborate with GTC, EA and IT Security teams to receive the green light for the chosen topology; ensure that new development will fulfil the security and technical requirements
• Design and document the application topology, technology recommendation and delivery methodology
• Work with the project manager and external service providers to estimate the resources, budget and plan. Highlight the known risks.
• Design unit/integration tests, contribute to the engineering wiki, and document work
• Supervise the development up to the UAT phase and coordinate bug fixing during UAT
• Support rollouts and deployments to Production, ensuring the cutover is defined and followed up
• Ensure data pipeline maintenance and performance testing
• Responsible for the CI/CD of developments
Central Data Lake Platform Governance monitoring, automation, optimization and support:
• Institute patterns that support data ingestion, data movement, transformations, aggregations, and more
• Create optimal data pipeline architecture, ensuring that the data has the most suitable format in terms of consumption and cost at every stage of the data lake
• Conceptualize and generate infrastructure that allows big data to be accessed and analyzed
• Reformulate existing data frameworks to optimize their functioning
• Remain up to date with industry standards and technological advancements that will improve the operation and quality of the data platform
• Design and implement monitoring processes for the whole Central Data Lake Platform, including performance, cost, and availability indicators at the project/resource level
• Provide, review and confirm technical solutions aligned with the Data Platform Delivery Team Manager
• Identify and design internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
Design the DevOps service platform architecture, streamline software delivery, and ensure the continuous improvement of systems and processes:
• Infrastructure Management: Design, implement, and manage infrastructure using Infrastructure as Code (IaC) tools
• CI/CD Pipelines: Develop and maintain Continuous Integration and Continuous Delivery (CI/CD) pipelines to automate software and code releases
• Collaboration: Facilitate communication and collaboration between development, operations, and other stakeholders to improve productivity and efficiency
• Monitoring and Logging: Implement monitoring, logging, alerts, and dashboards to track the performance and health of applications and infrastructure
• Automation: Write and maintain scripts to automate tasks and DevOps processes
• Support and Troubleshooting: Provide support and troubleshoot issues related to applications, systems, and infrastructure
• Cloud Management: Efficiently manage and monitor cloud resources, implementing autoscaling and other techniques to maintain optimal performance
Skills and Qualifications:
Central Data Lake Platform innovation:
• Identify development tasks where the implementation time can be reduced: standard and configurable Data Factory pipelines, Databricks classes, SQL data models for configuration, control, and logging of data pipelines, generic Logic Apps, Azure Functions, and ARM templates
• Work with Delivery Team Managers within the Data Platform team to review and identify developments and processes that could be parameterized in the CDLP, and ensure adoption across projects for optimization purposes (time and cost)
• Stay abreast of trends and new capabilities of the CDLP and Power BI platforms
• Work closely with Microsoft counterparts to identify new features in the roadmap, assess and analyze them, and decide whether early adoption would be beneficial for JTI
• Proactively analyze technology trends & JTI business demand to identify upcoming opportunities
• Drive internal proof-of-concept initiatives. When needed, quickly design and implement a prototype of a system or component with a proper architecture, and then, upon a successful POC, hand it over to a development team to implement
Who are we looking for - Requirements:
- College or University degree
- 2+ years of strong experience with metadata-driven frameworks; CPG.ai is a must
- 10+ years of experience in Data and Analytics with high exposure to IT architecture and solutioning for data flows among systems and applications
- 5+ years delivering solutions on the Microsoft Azure platform with a special emphasis on data solutions and services
- 5+ years delivering solutions on the Power BI platform with a special emphasis on architecture, security, and maintenance
- 5+ years of DevOps and CI/CD experience
- 5+ years of Python work experience
Are you ready to join us? Build your success story at JTI. Apply now!
Next Steps:
After applying, if selected, please anticipate the following within 1-3 weeks of the job posting closure: Phone screening with Talent Advisor > Assessment tests > Interviews > Offer. Each step is eliminatory and may vary by role type.
At JTI, we strive to create a diverse and inclusive work environment. As an equal-opportunity employer, we welcome applicants from all backgrounds. If you need any specific support, alternative formats, or have other access requirements, please let us know.