Build Big Data Execution Environment (Spark and Zeppelin).
$250-750 USD
Paid on delivery
Title: Build Big Data Execution Environment (Spark and Zeppelin).
Requirements
I need a physical setup of a big data platform for my data analysis work:
- Spark (Spark SQL, MLlib, and SparkR) and Zeppelin.
I'm looking for expert(s) to configure Spark and Zeppelin on my Linux server.
I'm not a Java programmer and have no experience in the big data area.
So you need to provide the following:
- Install and configure a big data platform.
- Spark in standalone mode.
- Hadoop / Spark (Spark SQL, MLlib, and SparkR) and Zeppelin.
- Use the latest versions of the related software.
- You also need to install the related programs and packages on my server.
- I need to use the Zeppelin web UI for exploring data on Spark.
- The goal of this project is to perform data analysis on Spark through Zeppelin.
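A minimal sketch of the install and configuration steps requested above, for a single Linux box. The version numbers, download URLs, and install paths below are assumptions for illustration only; check spark.apache.org and zeppelin.apache.org for the current releases before running anything.

```shell
# Sketch: standalone Spark + Zeppelin on one Linux server.
# Versions, mirrors, and paths are placeholders -- verify before use.

# 1. Java is a prerequisite for both Spark and Zeppelin.
sudo apt-get install -y openjdk-8-jdk   # or your distro's equivalent

# 2. Download and unpack Spark (pre-built for Hadoop).
wget https://archive.apache.org/dist/spark/spark-2.0.2/spark-2.0.2-bin-hadoop2.7.tgz
sudo tar xzf spark-2.0.2-bin-hadoop2.7.tgz -C /opt
export SPARK_HOME=/opt/spark-2.0.2-bin-hadoop2.7

# 3. Start the standalone master and one worker on the same machine.
$SPARK_HOME/sbin/start-master.sh
$SPARK_HOME/sbin/start-slave.sh spark://$(hostname):7077

# 4. Download and unpack Zeppelin, then point it at the Spark master.
wget https://archive.apache.org/dist/zeppelin/zeppelin-0.6.2/zeppelin-0.6.2-bin-all.tgz
sudo tar xzf zeppelin-0.6.2-bin-all.tgz -C /opt
echo "export SPARK_HOME=$SPARK_HOME"              >> /opt/zeppelin-0.6.2-bin-all/conf/zeppelin-env.sh
echo "export MASTER=spark://$(hostname):7077"     >> /opt/zeppelin-0.6.2-bin-all/conf/zeppelin-env.sh
/opt/zeppelin-0.6.2-bin-all/bin/zeppelin-daemon.sh start
```

Note that both the Spark master web UI and Zeppelin default to port 8080, so on a single box you would change `zeppelin.server.port` in `zeppelin-site.xml` (or stop the one you are not using) to avoid the conflict.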
You need to give me the following:
- For the sample programs, use Scala.
- All sample programs need to run in the Zeppelin UI or the Scala CLI.
- Pre-load a few sample data sets (for example, [login to view URL]).
- Sample programs for loading data into Spark -- basically, loading data from a text file.
- Sample programs for Spark queries and ML.
- A very basic Scala program for the "collaborative filtering" method.
- A few Spark SQL queries.
- A sample R program that makes use of SparkR.
- Simple documentation.
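A rough sketch of what the requested Scala samples could look like, written for a Zeppelin `%spark` paragraph or `spark-shell`, where a `SparkSession` named `spark` is already defined. The file path, data format, and column names below are hypothetical placeholders, not part of the posting.

```scala
// Assumes a Zeppelin %spark paragraph or spark-shell session,
// where `spark` (a SparkSession) is predefined.
import org.apache.spark.ml.recommendation.ALS
import spark.implicits._

// 1. Data loading: read a plain text file into a DataFrame.
//    Assumed line format: "userId,movieId,rating" (placeholder path).
case class Rating(userId: Int, movieId: Int, rating: Float)
val ratings = spark.read.textFile("/data/sample_ratings.txt")
  .map { line =>
    val f = line.split(",")
    Rating(f(0).toInt, f(1).toInt, f(2).toFloat)
  }.toDF()

// 2. Spark SQL: register a temp view and query it.
ratings.createOrReplaceTempView("ratings")
spark.sql("SELECT movieId, AVG(rating) AS avg_rating " +
          "FROM ratings GROUP BY movieId ORDER BY avg_rating DESC").show(10)

// 3. Very basic collaborative filtering with MLlib's ALS estimator.
val Array(train, test) = ratings.randomSplit(Array(0.8, 0.2))
val model = new ALS()
  .setUserCol("userId").setItemCol("movieId").setRatingCol("rating")
  .setRank(10).setMaxIter(10).setRegParam(0.1)
  .fit(train)
model.recommendForAllUsers(5).show(5)   // top-5 item recommendations per user
```

The SparkR deliverable would follow the same shape: recent SparkR releases expose `read.df` for loading data and a `spark.als` wrapper for the same collaborative-filtering model.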
I will send you the Linux server information once we have a contract.
This is an initial project, and if it is completed well, I'd like to extend it into a real business application.
Project ID: #12014263
About the project
Awarded to:
I am a data scientist with experience working with big data technologies like Spark, Hadoop, etc. I understand how to install Spark and other engines at the back end of Zeppelin. I would like to do this project.
14 freelancers are bidding an average of $531 for this job
Your project is a perfect match for my skills. Looking forward to supporting you with this project. Best regards, Joerg.
I have been working with Hadoop for the past 2 years. I have very good experience deploying Hadoop clusters. I helped one of our clients build a 10-node Hadoop cluster and am still supporting and maintaining it. I have a g…
I am an expert in big data / Spark / Ambari. I can do this within a week and can give KT (knowledge transfer) to you or your team.
Can you also provide more details about your environment: is it a Linux Fedora or Ubuntu box, and what are the RAM details? Also, do you need a single-node or multi-node cluster? Which Spark and Hadoop versions are required?
Hadoop-certified developer with 7 years of experience. Good exposure to Hadoop and ecosystem tools like Spark, Java, and Scala.
Interested in working on this project. Total experience: 16+ years in Java, J2EE. VISA: USA, valid till 2024. Resume summary: 16+ years of experience in the IT industry, handling a 180+ resource team, 4+ geographic locat…