Employment Type: Full time
Experience: 4-6 years
Summary of Job Purpose:
A revolution is brewing, and Absolutdata is at the epicentre of the revolution called “Big Data” - a megatrend impacting every facet of business decision making. With our extensive experience converting big data into big insights, clients are reaping large bottom-line and top-line results. Whether it is large amounts of data (Volume), fast-changing or streaming data (Velocity), or multiple types of data such as unstructured data (Variety), we combine traditional analytics with emerging techniques such as Machine Learning and Artificial Intelligence to connect data with business results.
Terabytes of data, Big Data infrastructure, Hadoop, MapReduce, Hive - do these words and phrases excite you and make you yearn for more? If yes, then we are looking for you. You will build the Big Data infrastructure, understand the business need, and then recommend, build and implement the most optimal solution. You will work closely with the operations and analytics teams to understand data landscape requirements and build the systems, processes and teams required while maintaining scalability, security and reliability.
Key Skills and Experience:
- Experience in end-to-end solution design, including capacity planning, requirement gathering, effort estimation and optimizing solution stacks for maximum ROI
- Practical experience and in-depth understanding of MapReduce
- Hands-on experience with Hive, Pig, Flume, MapReduce and Sqoop
- Strong programming background with expertise in Python
- Familiarity with the data infrastructure tools landscape, e.g. cloud service providers, software, system monitoring tools and development environments
- Ability to program and guide team members on technical aspects
- Ability to craft documents that explain complex ideas in simple terms in order to build consensus or educate
- Knowledge of R or any other statistical programming language is a plus
Qualifications and Skills:
- 4-6 years of total IT experience working on back-end applications, with 3+ years of experience with Big Data technologies such as Hadoop and its related ecosystem
- Graduate / Postgraduate degree in Computer Science or a related field