Big Data and Hadoop Support
₹12500-37500 INR
Paid on delivery
Working on a Hadoop project that involves Spark with Scala, Hive, Impala, and Sqoop.
Looking for 2 hours of support daily.
Monthly payout: INR 20,000
Please mention experience with all these technologies.
Project ID: #17732704
About the project
16 freelancers are bidding an average of ₹22882 for this job
Hi, I am a data engineer with 3+ years of experience in the industry. I have worked on deploying highly scalable, resilient, and durable solutions using big data technologies, both in the cloud and on-premise. I have expertise in f…
I have 2 years of experience in all these technologies and hold certifications in Spark and the Hadoop ecosystem as well. Let's discuss in chat to finalize the deal.
I have been working on these technologies for over a year and have gained a decent amount of knowledge: Hadoop, 1.6 years; Sqoop, 6 months; Spark with Java, 1 year; Scala, 3 months; Hive, 1 year.
What kind of work would it be? Data scrubbing, transformations? What kind of processing would you need performed?
• Possess 2 years of analysis and development experience working on projects and prototypes. • Hands-on experience with major components of the Hadoop ecosystem such as Apache Spark, MapReduce, HDFS, Hive, Pig, Sqoop, and HBase…
For the past 4 years, I have been working on various components of the Hadoop ecosystem, including Spark, Impala, Hive, and Sqoop. Yes, I can provide the assistance required. Relevant skills and…
I currently work as a Hadoop developer on a project involving Spark with Scala, Hive, Impala, Sqoop, and Python. Although this work requires little experience, I can complete it as an experienced professional. I c…