Eaton Recruitment | Data Engineering Apprentice | Pune
| Company | Location | Position | Apply Link |
|---|---|---|---|
| Today Walkins | Across India | Multiple Positions | Check Here |
| Top MNCs | Across India | Internships | Apply Here |
| Infosys | Across India | Specialist Programmer/Digital Specialist Engineer Trainee (BE/BTech/ME/MTech/MCA/MSc-2026 Batch) | Apply Here |
| Cognizant | Across India | Analyst Trainee (Graduation-2026 Batch) | Apply Here |
| Cognizant | Across India | Analyst Trainee (Graduation-2024/2025 Batch) | Apply Here |
| Amazon | Across India | Multiple Positions | Apply Here |
| Accenture | Across India | Multiple Positions | Apply Here |
| Wipro | Across India | Multiple Positions | Apply Here |
| Cognizant | Across India | Multiple Positions | Apply Here |
| Genpact | Across India | Multiple Positions | Apply Here |
Eaton is recruiting in Pune for a Data Engineering Apprentice. Undergraduates and graduates are eligible to apply for this job. More details regarding the Eaton Pune job openings are given below.
Company Name: Eaton
Job Location: Pune
Job Category: IT | Software
Job Position: Data Engineering Apprentice
Qualification: Undergraduates, Graduates
Job Description:
- Eaton is hiring for Data Engineering Apprentice.
- Work Type: Hybrid.
- Work Location: Pune.
- The Data Engineering Apprentice will be part of the Digital Finance Data Engineering team and will support the design, development, and operation of enterprise data pipelines and data platforms.
- This role is intended for early career candidates who are eager to build strong foundations in modern data engineering practices while working in a governed, enterprise-scale environment.
- The apprentice will work under the guidance of senior data engineers and managers, gaining hands-on experience with cloud data platforms, data integration, data modeling standards, and finance-domain datasets.
Key Responsibilities:
Data Engineering & Platform Support
- Assist in building and maintaining data pipelines for ingesting, transforming, and validating data from various source systems.
- Support data transformations using SQL and Python under established engineering standards.
- Help with data quality checks, reconciliation processes, and basic troubleshooting of data issues.
- Participate in documenting data pipelines, table definitions, and engineering artifacts.
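The pipeline-support duties above (ingest, transform, validate) can be pictured with a minimal Python sketch. This is purely illustrative: the record fields and validation rules are invented for the example and are not Eaton's actual standards or stack.

```python
# Illustrative ingest -> transform -> validate step.
# Field names ("account", "amount") and rules are hypothetical examples,
# not taken from Eaton's pipelines.

def transform(rows):
    """Normalize raw records: trim whitespace, uppercase keys, cast amounts."""
    out = []
    for r in rows:
        out.append({
            "account": r["account"].strip().upper(),
            "amount": float(r["amount"]),
        })
    return out

def validate(rows):
    """Basic data-quality checks: no empty account, no negative amount."""
    errors = []
    for i, r in enumerate(rows):
        if not r["account"]:
            errors.append((i, "missing account"))
        if r["amount"] < 0:
            errors.append((i, "negative amount"))
    return errors

raw = [{"account": " acc-001 ", "amount": "120.50"},
       {"account": "acc-002", "amount": "-5"}]
clean = transform(raw)
issues = validate(clean)
print(clean[0]["account"])  # ACC-001
print(issues)               # [(1, 'negative amount')]
```

In a real enterprise pipeline these steps would typically run inside an orchestrated tool rather than a plain script, but the shape of the work (normalize, then check, then report issues) is the same.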
Learning & Engineering Practices
- Learn and apply modern data engineering practices including ELT/ETL pipelines, version control, and CI/CD fundamentals.
- Follow enterprise data engineering standards for naming conventions, data modeling, and code quality as defined by the team.
- Gain hands-on exposure to cloud data platforms such as Snowflake and Azure-based data services.
- Participate in code reviews and technical walkthroughs as a learning opportunity.
Collaboration & Communication
- Work closely with senior data engineers, analysts, and product owners to understand business and technical requirements.
- Support team activities such as sprint planning, backlog grooming, and sprint reviews in an Agile delivery model.
- Communicate progress, issues, and learnings clearly to mentors and team members.
Data Governance & Compliance
- Learn and adhere to data governance, security, and access control standards.
- Assist in implementing basic data validation, audit columns, and control checks required for enterprise and finance data.
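The "audit columns" mentioned above are extra fields stamped onto each record so loads can be traced and reconciled. A small hedged sketch, with invented column names (`load_ts`, `batch_id`, `row_hash`) standing in for whatever the team's standard defines:

```python
# Hypothetical audit-column stamping; the column names are illustrative,
# not an actual enterprise standard.
import hashlib
import json
from datetime import datetime, timezone

def add_audit_columns(record, batch_id):
    """Return a copy of the record with traceability fields attached."""
    stamped = dict(record)
    stamped["load_ts"] = datetime.now(timezone.utc).isoformat()
    stamped["batch_id"] = batch_id
    # Deterministic hash of the source record, useful for change detection.
    stamped["row_hash"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()
    ).hexdigest()
    return stamped

rec = {"account": "ACC-001", "amount": 120.5}
stamped = add_audit_columns(rec, batch_id="B-20240101")
print(stamped["batch_id"])  # B-20240101
```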
Eligibility Criteria:
- Undergraduate (or recent graduate) in Computer Science, Information Technology, Data Science, Engineering, or a related field.
Technical Skills (Basic/Foundational):
- Fundamental knowledge of SQL (SELECT, JOINs, basic aggregations).
- Basic programming knowledge in Python or a similar language.
- Understanding of relational databases and data concepts (tables, keys, data types).
- Familiarity with basic data engineering or analytics concepts is a plus.
- Strong willingness to learn and take feedback positively.
- Good analytical and problem-solving skills.
- Clear written and verbal communication skills.
- Ability to work collaboratively in a team environment.
- Exposure to cloud platforms (Azure, AWS, or GCP) in coursework or projects.
- Basic familiarity with Snowflake, Spark, or data integration tools (e.g., ADF) is an advantage but not mandatory.
- Academic or personal projects involving data pipelines or databases.
- Interest in finance or enterprise data domains.
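As a quick self-check on the SQL fundamentals listed above (SELECT, JOINs, basic aggregations), here is a small refresher using Python's built-in sqlite3 module; the table and column names are invented for illustration:

```python
# SELECT + JOIN + GROUP BY refresher with an in-memory SQLite database.
# Schema and data are made up for the example.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE accounts (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE transactions (
        id INTEGER PRIMARY KEY,
        account_id INTEGER REFERENCES accounts(id),
        amount REAL
    );
    INSERT INTO accounts VALUES (1, 'Operations'), (2, 'Treasury');
    INSERT INTO transactions VALUES (1, 1, 100.0), (2, 1, 50.0), (3, 2, 75.0);
""")

# JOIN the two tables and aggregate: total amount per account.
rows = conn.execute("""
    SELECT a.name, SUM(t.amount) AS total
    FROM accounts a
    JOIN transactions t ON t.account_id = a.id
    GROUP BY a.name
    ORDER BY a.name
""").fetchall()
print(rows)  # [('Operations', 150.0), ('Treasury', 75.0)]
```

Being comfortable writing a query like this by hand is roughly the level of SQL the eligibility criteria describe.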
How To Apply:
IT Jobs: Check Here
Apprentice Jobs: Check Here
Pune Jobs: Check Here

