Hadoop Engineering Internship in Gurgaon at Myananta Technologies Private Limited
Applications are closed for this internship.
Start Date
Starts immediately
Duration
3 Months
Stipend
10,000 /month
Apply By
29 Jun' 22
The hiring for this internship will be online, and the company will provide work from home/deferred joining until the current COVID-19 situation improves
Posted 3 weeks ago
Internship
About Myananta Technologies Private Limited
We are a millennial organization working on software and application development, both on-premise and on-demand. Our team consists of expert software professionals who believe in being a catalyst for change and in bringing new technical insight and expertise. We learn and train through project implementation using agile methodology, following the complete SDLC and PDLC cycles. The IT industry landscape is changing as the world goes digital: analytics is the bread and butter of decision making, even though much of it still depends on on-premise systems. We incorporate today's and tomorrow's demands of industry dynamics and corporate functions, and we believe technology allows us to design, develop, and deploy expertise, analytics, and other intellectual property through software tools. We develop software and applications and provide information technology production support.
Activity on Internshala
Hiring since April 2021
33 opportunities posted
26 candidates hired
About the internship
Selected intern's day-to-day responsibilities include:

1. Install, configure, and maintain big data components/technologies (Hadoop, Hive, Spark, and Airflow), both standalone and as clusters
2. Work on similar cloud-specific components/products/services
3. Clean, transform, and analyze vast amounts of raw data from various source systems
4. Design workflows and data processing pipelines
5. Produce unit tests for big data programs and helper methods
6. Write Scaladoc-style documentation alongside all code
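The responsibilities above center on cleaning raw data and unit-testing the helper methods that do it (items 3 and 5). A minimal sketch in plain Python, with hypothetical record fields and function names, might look like:

```python
# Illustrative sketch only (hypothetical field names): a small transform of
# the kind described in items 3 and 5 -- clean raw records, then unit-test
# the helper in PyTest style.
from typing import Optional


def clean_record(raw: dict) -> Optional[dict]:
    """Normalize one raw source record; drop rows missing a user id."""
    user_id = raw.get("user_id")
    if user_id is None:
        return None
    return {
        "user_id": str(user_id).strip(),
        "amount": float(raw.get("amount", 0) or 0),
    }


def clean_records(rows):
    """Apply clean_record to every row and filter out unusable ones."""
    return [r for r in (clean_record(row) for row in rows) if r is not None]


# A unit test in the style of item 5 (PyTest discovers test_* functions):
def test_clean_records_drops_missing_ids():
    rows = [{"user_id": " 42 ", "amount": "3.5"}, {"amount": 1}]
    assert clean_records(rows) == [{"user_id": "42", "amount": 3.5}]
```

In a real pipeline the same transform would typically run inside a Spark job, but keeping the logic in a plain function like this is what makes it unit-testable.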
Who can apply

Only those candidates can apply who:

1. are available for a full-time (in-office) internship

2. can start the internship between 15th Jun'22 and 20th Jul'22

3. are available for a duration of 3 months

4. have relevant skills and interests

Other requirements

1. B.Tech, BE, or a similar qualification with a major in computer science or IT

2. Working knowledge of Hadoop, Hive, Spark, and Airflow

3. Knowledge of Python (with a focus on the functional programming paradigm)

4. Knowledge of Scala (with a focus on the functional programming paradigm)

5. Knowledge of ScalaTest, JUnit, and PyTest/PyUnit

6. Knowledge of big data query tuning and performance optimization

7. Knowledge of SQL database integration (Microsoft SQL Server, Oracle, Postgres, and/or MySQL)
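The SQL database integration in requirement 7 can be sketched with Python's standard DB-API; sqlite3 stands in here for the Postgres/MySQL/Oracle drivers named above, which expose the same connect/cursor/execute interface (table and column names are hypothetical):

```python
# Minimal sketch of SQL database integration (requirement 7), using the
# stdlib sqlite3 module as a stand-in for a production database driver.
import sqlite3


def load_payments(conn) -> list:
    """Read rows back out of a SQL table via the DB-API cursor interface."""
    cur = conn.execute("SELECT user_id, amount FROM payments ORDER BY user_id")
    return cur.fetchall()


conn = sqlite3.connect(":memory:")  # in-memory DB for illustration
conn.execute("CREATE TABLE payments (user_id TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO payments VALUES (?, ?)",
    [("42", 3.5), ("7", 1.0)],
)
rows = load_payments(conn)
```

Swapping sqlite3 for psycopg2 (Postgres) or mysql-connector changes only the `connect` call and parameter placeholder style, since all follow the same DB-API pattern.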

Number of openings
1
