Read the Job Description: Start by reading the job description carefully and make sure you understand what the role involves.
Check Your Skills: Compare the skills the employer is looking for with your own. Note where you are strong and where you may need to learn more.
Check Eligibility Criteria: Review the qualifications required, such as education and experience, and confirm you meet them.
Update Your Resume and Cover Letter: Tailor your resume and cover letter to the job, highlighting the things that make you a good fit.
Learn About the Company: Spend some time researching the company: what it does and what it values. This helps you show genuine interest in the role.
Responsibilities:
- Data Pipeline Development: Assist in the design and implementation of data pipelines to extract, transform, and load (ETL) data from various sources into data warehouses or databases.
- Data Quality Assurance: Monitor and ensure the quality and integrity of data throughout the data lifecycle, identifying and resolving any data discrepancies or issues.
- Collaboration & Analysis: Work closely with data analysts, data scientists, and other stakeholders to understand data requirements, deliver solutions that meet business needs, and perform analyses aligned to the anchor domain.
- Documentation: Maintain clear and comprehensive documentation of data processes, pipeline architectures, and data models for reference and training purposes.
- Performance Optimization: Help optimize data processing workflows and improve the efficiency of existing data pipelines.
- Support Data Infrastructure: Assist in the maintenance and monitoring of data infrastructure, ensuring systems are running smoothly and efficiently.
- Learning and Development: Stay updated on industry trends and best practices in data engineering, actively seeking opportunities to learn and grow in the field.
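The ETL responsibility above can be sketched in miniature. The snippet below is an illustrative sketch only: the table name, columns, and data-quality rule are invented for the example, with an in-memory SQLite database standing in for the warehouse. It extracts rows from a raw CSV export, drops incomplete records as a simple quality check, and loads the cleaned rows:

```python
import csv
import io
import sqlite3

# Hypothetical raw export; in practice this would come from an API, file drop, or source database.
RAW_CSV = """order_id,amount,currency
1,19.99,usd
2,5.00,USD
3,,usd
"""

def extract(text):
    """Extract: parse rows from the raw CSV export."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: drop rows with missing amounts, normalize currency codes."""
    clean = []
    for row in rows:
        if not row["amount"]:
            continue  # data-quality rule: skip incomplete records
        clean.append((int(row["order_id"]), float(row["amount"]), row["currency"].upper()))
    return clean

def load(rows, conn):
    """Load: write the cleaned rows into the warehouse table."""
    conn.execute("CREATE TABLE IF NOT EXISTS orders (order_id INTEGER, amount REAL, currency TEXT)")
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
```

In production, orchestrators such as Airflow schedule and monitor many pipelines like this one, but the extract/transform/load shape stays the same.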
Skills:
- Programming: Familiarity with programming languages such as Python or JavaScript; knowledge of SQL and experience with databases (e.g., Snowflake, MySQL, or PostgreSQL) is preferred.
- Data Tools: Exposure to data processing frameworks and tools (e.g., PBS/Torque, Slurm, or Airflow) is a plus.
- Analytical Skills: Strong analytical and problem-solving skills, with a keen attention to detail.
- Communication Skills: Excellent verbal and written communication skills, with the ability to convey technical information clearly to non-technical stakeholders.
- Team Player: Ability to work collaboratively in a team environment and contribute to group projects.
- Adaptability: Willingness to learn new technologies and adapt to changing priorities in a fast-paced environment.
Eligibility Criteria:
1) Bachelor’s degree in Computer Science, Data Science, Information Technology, or a related field preferred; relevant coursework or certifications in data engineering or programming are a plus.
Location: Chennai