Instructions To Be Followed Before Applying for the Job:
Read the Job Description: Start by carefully reading the job description. Make sure you understand what the job involves.
Check Your Skills: Compare the skills they're looking for with what you know how to do. Figure out where you're strong and where you might need to learn more.
Check Eligibility Criteria: Look at the qualifications they want, like education and experience. Make sure you have what they're asking for.
Update Your Resume and Cover Letter: Change your resume and cover letter to match the job. Highlight the things that make you a good fit.
Learn About the Company: Take some time to find out about the company you want to work for. Understand what they do and what they care about. This helps you show you're interested in them.
Key Responsibilities:
- Build and maintain high-TPS, reliable, performant, and cost-effective data collection and extraction modules in Node.js and Python, using streaming solutions such as Kafka.
- Deploy, maintain, and support these modules on AWS and GCP.
- Index, archive, and retain the necessary data in multiple persistence stores, such as object stores (S3), key-value stores (DynamoDB), and Elasticsearch, depending on the use case.
- Manage the quality of the collected data using data-quality libraries built with SQL/Python/Spark on AWS Glue and exposed as monitoring dashboards in AWS QuickSight and Kibana.
- Expose the collected data to downstream applications as RESTful APIs through a Node.js backend.
- Collaborate with engineers, researchers, and data implementation specialists to design and build advanced, elegant, and efficient end-to-end competitive intelligence solutions.
Skills Required:
- Proven experience as a Software Development Engineer who has built, deployed, and operationally supported systems in production.
- Excellent knowledge of programming languages such as Node.js and Python.
- Strong understanding of software design patterns, algorithms, and data structures.
- Experience with SQL & NoSQL databases.
- Good communication and collaboration skills.
- Works with good ownership and accountability.
- Ability to work in a fast-paced and dynamic environment.
- Experience writing high-volume, high-TPS, reliable crawlers and scrapers is a plus.
- Bachelor's or Master's degree in Computer Science or a related field.
Eligibility Criteria:
1) Any graduate
Location: Remote