Hadoop MapReduce jobs

    2,000 hadoop mapreduce jobs found, prices in EUR

    Input: tuples (id, term), where "id" is the document identifier and "term" is a word from the text, already pre-processed. (Pseudocode/Python/PySpark/Spark.) (A small PySpark sketch follows this listing.)

    €93 (Avg Bid)
    €93 Average
    2 bids
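
    A minimal sketch for the listing above, assuming the goal is a per-document term count over the pre-processed (id, term) tuples; the listing does not name the actual algorithm, so the data and logic here are purely illustrative.

    from pyspark import SparkContext

    sc = SparkContext(appName="term-count-sketch")

    # Hypothetical pre-processed (doc_id, term) pairs standing in for the real input.
    pairs = sc.parallelize([
        ("d1", "hadoop"), ("d1", "spark"), ("d2", "hadoop"), ("d2", "hadoop"),
    ])

    # Classic MapReduce pattern: map each pair to ((doc_id, term), 1), then reduce by key.
    term_freq = (pairs
                 .map(lambda dt: ((dt[0], dt[1]), 1))
                 .reduceByKey(lambda a, b: a + b))

    print(sorted(term_freq.collect()))  # e.g. [(('d1', 'hadoop'), 1), ..., (('d2', 'hadoop'), 2)]
    sc.stop()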

    Algorithm development on MapReduce, using PySpark/Spark...

    €9 - €28
    €9 - €28
    0 bids

    Experience with operating systems (Linux and Windows); experience with solutions for integrating Kerberos via MIT Kerberos and/or Microsoft Active Directory; knowledge of the Cloudera solution for user authentication and integration with the Sentry permission system; implementation of PCI-DSS on the Cloudera platform (Kerberos, SSL, HDFS encryption and the Hadoop ecosystem components); implementation of auditing on the Cloudera platform. Education: IT field (Information Systems, Computer Science or similar), completed or in progress. Knowledge and skills: Cloudera solution for authentica...

    €17 / hr (Avg Bid)
    €17 / hr Average
    3 bids

    Symfony developers with frontend and backend experience and knowledge of: clear notions of RESTful APIs; PHP: Symfony 2 / Silex / SLIM, Doctrine, Composer, Zend Framework; databases: advanced knowledge of MySQL, MySQL query optimization, MyISAM / InnoDB engines; NoSQL systems: MongoDB / Hadoop, Redis / Memcached; Apache / Nginx / Varnish, rewriting / htaccess; JavaScript: NPM / NodeJS, Express, Gulp / Grunt; knowledge of HTTP2, websockets, AMQP; development experience with Linux and GIT...

    €256 (Avg Bid)
    €256 Average
    13 bids
    Web Developer (Closed)

    More than 2 years of experience in web development. Server-side languages such as Java, Python, NodeJs or PHP. Client-side languages: JavaScript, HTML5, CSS3. Server frameworks: WebPy, Laravel. Client frameworks: jQuery, Bootstrap. English level: intermediate. Desirable knowledge of Redis, new technology architectures, Amazon Web Services, Laravel Framework, Hadoop and NoSQL. Salary to be agreed.

    €226 (Avg Bid)
    €226 Average
    1 bid
    SMS Gateway (Closed)

    Develop an SMS Gateway. Requirements / technical knowledge: Java, Big Data (Hadoop), NoSQL. Salary and benefits to be agreed.

    €661 (Avg Bid)
    €661 Average
    4 bids

    Carry out the installation, configuration and maintenance of a Hadoop cluster with Spark. It will be a small cluster of approximately 4 machines (1 master and 3 slaves), and the installation must be done with Apache Ambari (a tool for automating cluster installation). This cluster can/should also contain other tools besides Hadoop (this includes Hue, Hive, Pig, Zeppelin, etc.). After that, a data extraction job will be carried out for the client; this extraction covers approximately 70 tables from a SQL Server database. The data volume is between ...

    €226 (Avg Bid)
    €226 Average
    3 bids

    ...in Apache Hadoop to deliver a 25-hour training session spread over 5 days in central Madrid. The planned dates are the week of 23 November or the week of 30 November. The schedule will be from 9:00 to 14:00. The professional must have public-speaking experience. They must create their supporting material in PowerPoint format and an installation script for each of the tools used in the course. The supporting material must be delivered fully finished one week before the course, but progress reviews will be held from the start of the engagement. Each point of the syllabus must have an associated hands-on session...

    €1812 (Avg Bid)
    €1812 Average
    1 bid

    I have encountered a problem with my Hadoop project and need assistance. My system shows ": HADOOP_HOME and are unset", and I am not certain whether I have set the HADOOP_HOME and related variables correctly. This happens when creating a pipeline release in DevOps. For this project, I am looking for someone who: - has extensive knowledge of Hadoop and its environment variables - can determine whether I have set the HADOOP_HOME and related variables correctly and resolve any issues with them - is able to figure out the version of Hadoop installed on my system and solve compatibility issues, if any. I will pay for the solution immediately. (A small environment-check sketch follows this listing.)

    €20 / hr (Avg Bid)
    €20 / hr Average
    14 bids
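
    A minimal check for the listing above, assuming the pipeline can run a short Python step before anything Hadoop-dependent starts; it only verifies that HADOOP_HOME is set and points at a real directory, which is the usual first thing to rule out with this error.

    import os
    import sys

    # Fail fast if HADOOP_HOME is missing or does not point at a real directory,
    # which is the condition behind the "HADOOP_HOME ... unset" style of message.
    hadoop_home = os.environ.get("HADOOP_HOME")
    if not hadoop_home or not os.path.isdir(hadoop_home):
        sys.exit("HADOOP_HOME is unset or does not point to a Hadoop installation")

    print(f"Using Hadoop at {hadoop_home}")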

    *Title: Freelance Data Engineer* *Description:* We are seeking a talented freelance data engineer to join our team on a project basis. The ideal candidate will have a strong background in data engineering, with expertise in designing, implementing, and maintaining data pipelines and infrastructure. You will work closely with our data scientists and analysts to ensure the smooth flow of data from various sources to our data warehouse, and to support the development of analytics and machine learning solutions. This is a remote position with flexible hours. *Responsibilities:* - Design, build, and maintain scalable and efficient data pipelines to collect, process, and store large volumes of data from diverse sources. - Collaborate with data scientists and analysts to understand data require...

    €78 (Avg Bid)
    €78 Average
    3 bids

    As a beginner, I am seeking a knowledgeable developer who can guide me on effectively using Google Cloud for Hadoop, Spark, Hive, Pig, and MR. The main goal is data processing and analysis. Key Knowledge Areas Needed: - Google Cloud usage for big data management - Relevant functionalities of Hadoop, Spark, Hive, Pig, and MR - Best practices for data storage, retrieval, and workflow streamlining Ideal Skills: - Extensive Google Cloud experience - Proficiency in Hadoop, Spark, Hive, Pig, and MR for data processes - Strong teaching abilities for beginners - Demonstrated experience in data processing and analysis.

    €158 (Avg Bid)
    €158 Average
    14 bids

    As a beginner, I am seeking a knowledgeable developer who can guide me on effectively using Google Cloud for Hadoop, Spark, Hive, Pig, and MR. The main goal is data processing and analysis. Key Knowledge Areas Needed: - Google Cloud usage for big data management - Relevant functionalities of Hadoop, Spark, Hive, Pig, and MR - Best practices for data storage, retrieval, and workflow streamlining Ideal Skills: - Extensive Google Cloud experience - Proficiency in Hadoop, Spark, Hive, Pig, and MR for data processes - Strong teaching abilities for beginners - Demonstrated experience in data processing and analysis.

    €171 (Avg Bid)
    €171 Average
    26 bids

    As a beginner, I am seeking a knowledgeable developer who can guide me on effectively using Google Cloud for Hadoop, Spark, Hive, Pig, and MR. The main goal is data processing and analysis. Key Knowledge Areas Needed: - Google Cloud usage for big data management - Relevant functionalities of Hadoop, Spark, Hive, Pig, and MR - Best practices for data storage, retrieval, and workflow streamlining Ideal Skills: - Extensive Google Cloud experience - Proficiency in Hadoop, Spark, Hive, Pig, and MR for data processes - Strong teaching abilities for beginners - Demonstrated experience in data processing and analysis.

    €19 (Avg Bid)
    €19 Average
    11 bids

    ...commonly used packages, especially with GCP. Hands-on experience in data migration and data processing on the Google Cloud stack, specifically: BigQuery, Cloud Dataflow, Cloud DataProc, Cloud Storage, Cloud DataPrep, Cloud PubSub, Cloud Composer & Airflow. Experience designing and deploying large-scale distributed data processing systems with a few technologies such as PostgreSQL or equivalent databases, SQL, Hadoop, Spark, Tableau. Hands-on experience with Python-JSON nested data operations. Exposure to or knowledge of API design, REST including versioning, isolation, and micro-services. Proven ability to define and build architecturally sound solution designs. Demonstrated ability to rapidly build relationships with key stakeholders. Experience of automated unit testing, automated integra...

    €12 / hr (Avg Bid)
    €12 / hr Average
    11 bids

    I am looking for a skilled professional who can efficiently set up a big data cluster. REQUIREMENTS: • Proficiency in Elasticsearch, Hadoop, Spark, Cassandra • Experience in working with large-scale data storage (10+ terabytes). • Able to structure data effectively. SPECIFIC TASKS INCLUDE: - Setting up the Elasticsearch, Hadoop, Spark, Cassandra big data cluster. - Ensuring the data to be stored is structured. - Preparing for the ability to handle more than 10 terabytes of data. The ideal candidate will have substantial experience in large data structures and a deep understanding of big data database technology. I encourage experts in big data management and those well-versed in the best practices of big data to bid for this project.

    €28 / hr (Avg Bid)
    €28 / hr Average
    3 bids

    We are looking for an Informatica BDM developer with 7+ years of experience who can support us for 8 hours a day, Monday to Friday. Title: Informatica BDM Developer Experience: 5+ Yrs 100% Remote Contract: Long term Timings: 10:30 am - 07:30 pm IST Required Skills: Informatica Data Engineering, DIS and MAS • Databricks, Hadoop • Relational SQL and NoSQL databases, including some of the following: Azure Synapse/SQL DW and SQL Database, SQL Server and Oracle • Core cloud services from at least one of the major providers in the market (Azure, AWS, Google) • Agile methodologies, such as SCRUM • Task tracking tools, such as TFS and JIRA

    €1137 (Avg Bid)
    €1137 Average
    3 bids

    I am seeking a skilled professional proficient in managing big data tasks with Hadoop, Hive, and PySpark. The primary aim of this project involves processing and analyzing structured data. (A small PySpark/Hive sketch follows this listing.) Key Tasks: - Implementing Hadoop, Hive, and PySpark for my project to analyze large volumes of structured data. - Using Hive and PySpark for sophisticated data analysis and processing techniques. Ideal Skills: - Proficiency in the Hadoop ecosystem - Experience with Hive and PySpark - Strong background in working with structured data - Expertise in big data processing and data analysis - Excellent problem-solving and communication skills Deliverables: - Converting raw data into useful information using Hive and visualizing the query results as graphical representations. - C...

    €16 / hr (Avg Bid)
    €16 / hr Average
    15 bids
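
    A minimal sketch for the listing above of the "raw data into useful information" step, assuming a reachable Hive metastore and a hypothetical table sales(region, amount); table and column names are placeholders, not from the listing.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    # Hive-enabled session; assumes the cluster has a configured Hive metastore.
    spark = (SparkSession.builder
             .appName("hive-analysis-sketch")
             .enableHiveSupport()
             .getOrCreate())

    # Read a structured Hive table into a DataFrame and run a simple aggregation.
    sales = spark.table("sales")
    summary = sales.groupBy("region").agg(F.sum("amount").alias("total_amount"))
    summary.show()

    spark.stop()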

    ...currently seeking a Hadoop professional with strong expertise in PySpark for a multi-faceted project. Your responsibilities will include, but not be limited to: - Data analysis: You'll be working with diverse datasets including customer data, sales data and sensor data. Your role will involve deciphering this data, identifying key patterns and drawing out impactful insights. - Data processing: A major part of this role will be processing the mentioned datasets and preparing them effectively for analysis. - Performance optimization: The ultimate aim is to enhance our customer targeting, boost sales revenue and identify patterns in sensor data. Utilizing your skills to optimize performance in these areas will be highly appreciated. The ideal candidate will be skilled in ...

    €429 (Avg Bid)
    €429 Average
    25 bids

    ...R), and other BI essentials, join us for global projects. What We're Looking For: Business Intelligence Experts with Training Skills: Data analysis, visualization, and SQL Programming (Python, R) Business acumen and problem-solving Effective communication and domain expertise Data warehousing and modeling ETL processes and OLAP Statistical analysis and machine learning Big data technologies (Hadoop, Spark) Agile methodologies and data-driven decision-making Cloud technologies (AWS, Azure) and data security NoSQL databases and web scraping Natural Language Processing (NLP) and sentiment analysis API integration and data architecture Why Work With Us: Global Opportunities: Collaborate worldwide across diverse industries. Impactful Work: Empower businesses through data-drive...

    €19 / hr (Avg Bid)
    €19 / hr Average
    24 bids

    I'm launching an extensive project that needs a proficient expert in Google Cloud Platform (including BigQuery, GCS, Airflow/Composer), Hadoop, Java, Python, and Splunk. The selected candidate should display exemplary skills in these tools, and offer long-term support. Key Responsibilities: - Data analysis and reporting - Application development - Log monitoring and analysis Skills Requirements: - Google Cloud Platform (BigQuery, GCS, Airflow/Composer) - Hadoop - Java - Python - Splunk The data size is unknown at the moment, but proficiency in managing large datasets will be advantageous. Please place your bid taking into account all these factors. Your prior experience handling similar projects will be a plus. I look forward to working with a dedicated and know...

    €452 (Avg Bid)
    €452 Average
    54 bids

    ...commonly used packages, especially with GCP. Hands-on experience in data migration and data processing on the Google Cloud stack, specifically: BigQuery, Cloud Dataflow, Cloud DataProc, Cloud Storage, Cloud DataPrep, Cloud PubSub, Cloud Composer & Airflow. Experience designing and deploying large-scale distributed data processing systems with a few technologies such as PostgreSQL or equivalent databases, SQL, Hadoop, Spark, Tableau. Hands-on experience with Python-JSON nested data operations. Exposure to or knowledge of API design, REST including versioning, isolation, and micro-services. Proven ability to define and build architecturally sound solution designs. Demonstrated ability to rapidly build relationships with key stakeholders. Experience of automated unit testing, automated integra...

    €13 / hr (Avg Bid)
    €13 / hr Average
    6 bids

    As an ecommerce platform looking to optimize our data management, I require assistance with several key aspects of my AWS big data project, including: - Data lake setup and configuration - Development of AWS Glue jobs - Deployment of Hadoop and Spark clusters - Kafka data streaming The freelancer hired for this project must possess expertise in AWS, Kafka, and Hadoop. Strong experience with AWS Glue is essential given the heavy utilization planned for the tool throughout the project. Your suggestions and recommendations regarding these tools and technologies will be heartily welcomed, but keep in mind specific tools are needed to successfully complete this project.

    €782 (Avg Bid)
    €782 Average
    20 bids

    I'm in search of a professional proficient in AWS and MapReduce. My project involves: - Creation and execution of MapReduce jobs within the AWS infrastructure. - Specifically, these tasks will focus on processing a sizeable amount of text data. - The goal of this data processing is to perform an in-depth word frequency analysis, thereby extracting meaningful answers from the data. (A hedged word-count mapper/reducer sketch follows this listing.) The ideal freelancer for this job will have substantial experience handling data within these systems. Expertise in optimizing the performance of MapReduce jobs is also greatly desirable. For anyone dabbling in AWS, MapReduce and data analytics, this project can provide a challenging and rewarding experience.

    €35 (Avg Bid)
    €35 Average
    6 bids
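
    A hedged sketch for the listing above: one common way to run a word-frequency job on AWS is Hadoop Streaming on EMR with a Python mapper and reducer. Whether streaming or a native MapReduce job is wanted is an assumption, and the single-file layout is only for illustration.

    import sys

    def run_mapper():
        # Emit (word, 1) for every token on stdin.
        for line in sys.stdin:
            for word in line.strip().lower().split():
                print(f"{word}\t1")

    def run_reducer():
        # Sum the counts for each word; streaming guarantees input sorted by key.
        current, total = None, 0
        for line in sys.stdin:
            word, count = line.rstrip("\n").split("\t")
            if word != current:
                if current is not None:
                    print(f"{current}\t{total}")
                current, total = word, 0
            total += int(count)
        if current is not None:
            print(f"{current}\t{total}")

    if __name__ == "__main__":
        # In a real streaming job these would be two separate scripts passed via
        # -mapper and -reducer; here the role is picked by a command-line flag.
        run_reducer() if "--reduce" in sys.argv else run_mapper()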

    ...Queries: Write a SQL query to find the second highest salary. Design a database schema for a given problem statement. Optimize a given SQL query. Solution Design: Design a parking lot system using object-oriented principles. Propose a data model for an e-commerce platform. Outline an approach to scale a given algorithm for large datasets. Big Data Technologies (if applicable): Basic questions on Hadoop, Spark, or other big data tools. How to handle large datasets efficiently. Writing map-reduce jobs (if relevant to the role). Statistical Analysis and Data Processing: Write a program to calculate statistical measures like mean, median, mode (a small Python example follows this listing). Implement data normalization or standardization techniques. Process and analyze large datasets using Python libraries like Pandas. Rememb...

    €7 / hr (Avg Bid)
    €7 / hr Average
    36 bids
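
    A small example for the "statistical measures" item in the listing above, using only Python's standard statistics module; the data is made up.

    import statistics

    data = [4, 1, 2, 2, 3, 5, 2]

    # Mean, median and mode of the sample list.
    print("mean:", statistics.mean(data))      # 2.714...
    print("median:", statistics.median(data))  # 2
    print("mode:", statistics.mode(data))      # 2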

    ...customer-centric software products · Analyze existing software implementations to identify areas of improvement and provide deadline estimates for implementing new features · Develop software applications using technologies that include, but are not limited to, core Java (11+), Kafka or a messaging system, web frameworks like Struts / Spring, relational (Oracle) and non-relational databases (SQL, MongoDB, Hadoop, etc.), with a RESTful microservice architecture · Implement security and data protection features · Update and maintain documentation for team processes, best practices, and software runbooks · Collaborate with git in a multi-developer team · Appreciation for clean and well documented code · Contribution to database design ...

    €1303 (Avg Bid)
    €1303 Average
    51 bids

    A data analysis / data engineering project involving big data needs to be done. The candidate must have a command of big data solutions like Hadoop.

    €10 / hr (Avg Bid)
    €10 / hr Average
    8 bids

    I'm in search of an intermediate-level Java programmer well-versed in MapReduce. Your responsibility will be to implement the conceptual methods outlined in a given academic paper. What sets this task apart is that you're encouraged to positively augment the methodologies used: • Efficiency: Be creative with the paper's strategies and look for room for improvement in the program's efficiency. This could include enhancements to the program's capacity to process data, or to its speed. Ideal candidate should be seasoned in Java Programming, specifically MapReduce operations. Moreover, the ability to critically analyze and improve upon existing concepts will ensure success in this task. Don't hesitate to innovate, as long as you maintain the ...

    €129 (Avg Bid)
    €129 Average
    33 bids
    Hadoop administrator (Closed)

    Project Title: Advanced Hadoop Administrator Description: - We are seeking an advanced Hadoop administrator for an in-house Hadoop setup project. - The ideal candidate should have extensive experience and expertise in Hadoop administration. - The main tasks of the Hadoop administrator will include data processing, data storage, and data analysis. - The project is expected to be completed in less than a month. - The Hadoop administrator will be responsible for ensuring the smooth functioning of the Hadoop system and optimizing its performance. - The candidate should have a deep understanding of Hadoop architecture, configuration, and troubleshooting. - Experience in managing large-scale data processing and storage environments is requi...

    €287 (Avg Bid)
    €287 Average
    3 bids

    I am looking for a freelancer to help me with a Proof of Concept (POC) project focusing on Hadoop. Requirement: We drop a file in HDFS, which is then pushed to Spark or Kafka, and the final output/results are pushed into a database. The objective is to show that we can handle millions of records as input and put them in the destination. (A minimal PySpark sketch of the HDFS-to-database leg follows this listing.) The POC should be completed within 3-4 days and should have a simple level of complexity. Skills and experience required: - Strong knowledge and experience with Hadoop - Familiarity with HDFS and Kafka/Spark - Ability to quickly understand and implement a simple POC project - Good problem-solving skills and attention to detail

    €157 (Avg Bid)
    €157 Average
    9 bids
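
    A minimal sketch for the POC above covering only the Spark leg (file landed in HDFS, result written to a relational database over JDBC); the path, JDBC URL, credentials and table name are placeholders, and the Kafka variant is not shown.

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("hdfs-to-db-poc").getOrCreate()

    # Read the dropped file (assumed CSV with a header) straight from HDFS.
    df = spark.read.option("header", True).csv("hdfs:///landing/input.csv")

    # Push the records to the destination table; the JDBC driver must be on the classpath.
    (df.write
       .format("jdbc")
       .option("url", "jdbc:postgresql://dbhost:5432/poc")
       .option("dbtable", "poc_output")
       .option("user", "poc_user")
       .option("password", "poc_password")
       .mode("append")
       .save())

    spark.stop()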
    Hadoop HDFS Setup (Closed)

    ...upload the Hadoop package to HDFS. Use commands to show the IP addresses of all DataNodes. Provide detailed information (ls -l) of the blocks on each DataNode. Provide detailed information (ls -l) of the fsimage file and edit log file. Include screenshots of the Overview module, Startup Process module, DataNodes module, and Browse Directory module on the Web UI of HDFS. MapReduce Temperature Analysis: You are given a collection of text documents containing temperature data. Your task is to implement a MapReduce program to find the maximum and minimum temperatures for each year. Data Format: Year: second item in each line. Minimum temperature: fourth item in each line. Maximum temperature: fifth item in each line. (A mapper/reducer sketch along these lines follows this listing.) Submission Requirements: Submit the source code...

    €14 (Avg Bid)
    €14 Average
    2 bids
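
    A hedged sketch for the temperature task above: a Hadoop Streaming style mapper and reducer in Python, assuming whitespace-separated fields with the year as the second item and the minimum/maximum temperatures as the fourth and fifth items, as described in the listing.

    import sys

    def mapper():
        # Emit "year<TAB>min<TAB>max" for every valid input line.
        for line in sys.stdin:
            parts = line.split()
            if len(parts) >= 5:
                year, tmin, tmax = parts[1], float(parts[3]), float(parts[4])
                print(f"{year}\t{tmin}\t{tmax}")

    def reducer():
        # Keep the running minimum and maximum per year (input sorted by year).
        current, lo, hi = None, None, None
        for line in sys.stdin:
            year, tmin, tmax = line.rstrip("\n").split("\t")
            tmin, tmax = float(tmin), float(tmax)
            if year != current:
                if current is not None:
                    print(f"{current}\tmin={lo}\tmax={hi}")
                current, lo, hi = year, tmin, tmax
            else:
                lo, hi = min(lo, tmin), max(hi, tmax)
        if current is not None:
            print(f"{current}\tmin={lo}\tmax={hi}")

    if __name__ == "__main__":
        # Two separate scripts in a real streaming job; the flag is for local testing.
        reducer() if "--reduce" in sys.argv else mapper()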

    A big data project in Java needs to be done in 24 hours. The person needs to be experienced in Spark and Hadoop.

    €122 (Avg Bid)
    €122 Average
    10 bids
    Big Data Architect (Closed)

    Looking for a Hadoop specialist to design the query optimisation. Currently, the search freezes when the user tries to run more than one search at a time. We need to implement a solution. This is a remote project. Share your idea first if you have done any such work. Here the UI is in React and the backend is in Node.js.

    €15 / hr (Avg Bid)
    €15 / hr Average
    38 bids

    #Your code goes here
    # (HBase shell / JRuby script. The imports and method receivers were blank or lost in
    #  the original; the HBase client classes below are assumed from the put/get usage.)
    import 'org.apache.hadoop.hbase.client.HTable'
    import 'org.apache.hadoop.hbase.client.Put'

    # Convert every argument to Java bytes, as the HBase API expects
    def jbytes(*args)
      args.map { |arg| arg.to_s.to_java_bytes }
    end

    # Write several column/value pairs to one row in a single Put
    def put_many(table_name, row, column_values)
      table = HTable.new(@hbase.configuration, table_name)
      p = Put.new(*jbytes(row))
      column_values.each do |column, value|
        family, qualifier = column.split(':')
        p.add(*jbytes(family, qualifier), *jbytes(value))
      end
      table.put(p)
    end

    # Call put_many function with sample data
    put_many 'wiki', 'DevOps', {
      "text:" => "What DevOps IaC do you use?",
      "revision:author" => "Frayad Gebrehana",
      "revision:comment" => "Terraform"
    }

    # Get data from the 'wiki' table
    get 'wiki', 'DevOps'

    #Do not remove the exit call below
    exit

    €56 (Avg Bid)
    €56 Average
    7 bids

    I am in need of assistance with Hadoop for the installation and setup of the platform. Skills and experience required: - Proficiency in Hadoop installation and setup - Knowledge of different versions of Hadoop (Hadoop 1.x and Hadoop 2.x) - Ability to work within a tight timeline (project needs to be completed within 7 hours) Please note that there is no specific preference for the version of Hadoop to be used.

    €12 (Avg Bid)
    €12 Average
    2 bids
    Website WordPress - content in image (Closed)

    WordPress, black theme. Design as in the photo; images can be taken from Udemy. Content: Coupon Code: 90OFFOCT23 (subscribe by 7 Oct '23 or till stock lasts). Data Engineering Career Path: Big Data Hadoop and Spark with Scala; Scala Programming In-Depth; Apache Spark In-Depth (Spark with Scala); DP-900: Microsoft Azure Data Fundamentals. Data Science Career Path: Data Analysis In-Depth (With Python): https://www

    €6 (Avg Bid)
    Guaranteed
    €6
    4 entries

    Seeking an expert in both Hadoop and Spark to assist with various big data projects. The ideal candidate should have intermediate level expertise in both Hadoop and Spark. Skills and experience needed for the job: - Proficiency in Hadoop and Spark - Intermediate level expertise in Hadoop and Spark - Strong understanding of big data concepts and tools - Experience working on big data projects - Familiarity with data processing and analysis using Hadoop and Spark - Ability to troubleshoot and optimize big data tools - Strong problem-solving skills and attention to detail

    €20 / hr (Avg Bid)
    €20 / hr Average
    12 bids

    I am looking for a freelancer to compare the performance metrics of Hadoop, Spark, and Kafka using the data that I will provide. Skills and experience required: - Strong knowledge of big data processing architectures, specifically Hadoop, Spark, and Kafka - Proficiency in analyzing and comparing performance metrics - Ability to present findings through written analysis, graphs and charts, and tables and figures The comparison should focus on key performance metrics such as processing speed, scalability, fault tolerance, throughput, and latency. The freelancer should be able to provide a comprehensive analysis of these metrics and present them in a clear and visually appealing manner. I will explain more about the data

    €146 (Avg Bid)
    €146 Average
    23 bids

    Looking for Hadoop Hive Experts I am seeking experienced Hadoop Hive experts for a personal project. Requirements: - Advanced level of expertise in Hadoop Hive - Strong understanding of big data processing and analysis - Proficient in Hive query language (HQL) - Experience with data warehousing and ETL processes - Familiarity with Apache Hadoop ecosystem tools (e.g., HDFS, MapReduce) - Ability to optimize and tune Hadoop Hive queries for performance If you have a deep understanding of Hadoop Hive and can effectively analyze and process big data, then this project is for you. Please provide examples of your previous work in Hadoop Hive and any relevant certifications or qualifications. I am flexible with the timeframe for complet...

    €19 (Avg Bid)
    €19 Average
    2 bids
    Kafka Admin (Closed)

    I am looking for a Kafka Admin who can assist me with the following tasks: - Onboarding the Kafka cluster - Managing Kafka topics and partitions - It's already available in the company and we need to onboard it for our project. - Should be able to size and scope. - We will start with small data ingestion from the Hadoop data lake. - Should be willing to work on a remote machine. The ideal candidate should have experience in: - Setting up and configuring Kafka clusters - Managing Kafka topics and partitions - Troubleshooting Kafka performance issues The client already has all the necessary hardware and software for the Kafka cluster setup.

    €17 / hr (Avg Bid)
    €17 / hr Average
    11 bids

    Over the past years, I have devoted myself to a project involving Algorithmic Trading. My system leverages only pricing and volume data at market closing. It studies technical indicators for every stock in the S&P 500 from its IPO date, testing all possible indicator 'settings', as I prefer to call them. This process uncovers microscopic signals that suggest beneficial buying at market close and selling at the next day's close. Any signal with a p-value below 0.01 is added to my portfolio. Following this, the system removes correlated signals to prevent duplication. A Bayesian ranking of signals is calculated, and correlated signals with a lower rank are eliminated. The result is a daily optimized portfolio of buy/sell signals. This system, primarily built with numpy...

    €35 / hr (Avg Bid)
    ADC
    €35 / hr Average
    13 bids

    I am looking for a Hadoop developer with a strong background in data analysis. The scope of the project involves analyzing and interpreting data using Hadoop. The ideal candidate should have experience in Hadoop data analysis and be able to work on the project within a timeline of less than 1 month.

    €229 (Avg Bid)
    €229 Average
    4 bids

    I am looking for a Hadoop developer with a strong background in data analysis. The scope of the project involves analyzing and interpreting data using Hadoop. The ideal candidate should have experience in Hadoop data analysis and be able to work on the project within a timeline of less than 1 month.

    €11 (Avg Bid)
    €11 Average
    3 bids
    Big Data Statistics (Closed)

    1: model and implement efficient big data solutions for various application areas using appropriately selected algorithms and data structures. 2: analyse methods and algorithms, to compare and evaluate them with respect to time and space requirements and make appropriate design choices when solving real-world problems. 3: motivate and explain trade-offs in big data processing technique design and analysis in written and oral form. 4: explain the Big Data Fundamentals, including the evolution of Big Data, the characteristics of Big Data and the challenges introduced. 6: apply the novel architectures and platforms introduced for Big Data, i.e., Hadoop, MapReduce and Spark, to complex problems on Hadoop execu...

    €119 (Avg Bid)
    €119 Average
    9 bids

    I am looking for a freelancer who can help me with an issue I am facing with launching Apache Gobblin in YARN. Here are the details of the project: Error Message: NoClassDefFoundError (please note that this question was skipped, so the error message may not be accurate) Apache Gobblin Version: 2.0.0 YARN Configuration: Not sure Skills and Experience: - Strong knowledge and experience with Apache Gobblin - Expertise in Hadoop/YARN configuration and troubleshooting - Familiarity with interrupt exceptions and related issues - Ability to diagnose and resolve issues in a timely manner - Excellent communication skills to effectively collaborate with me and understand the problem If you have the required skills and experience, please bid on thi...

    €23 / hr (Avg Bid)
    €23 / hr Average
    10 bids
    Big data processing (Closed)

    Write MapReduce programs that give you a chance to develop an understanding of principles when solving complex problems on the Hadoop execution platform.

    €23 (Avg Bid)
    €23 Average
    9 bids
    Python Expert (Closed)

    I am looking for a Python expert who can help me with a specific task of implementing a MapReducer. The ideal candidate should have the following skills and experience: - Proficient in Python programming language - Strong knowledge and experience in MapReduce framework - Familiarity with web scraping, data analysis, and machine learning would be a plus The specific library or framework that I have in mind for this project is [insert library/framework name]. I have a tight deadline for this task, and I prefer it to be completed in less than a week.

    €10 / hr (Avg Bid)
    €10 / hr Average
    52 bids
    Mapreduce Program (Closed)

    I am looking for a freelancer to develop a Mapreduce program in Python for data processing. The ideal candidate should have experience in Python programming and a strong understanding of Mapreduce concepts. Requirements: - Proficiency in Python programming language - Knowledge of Mapreduce concepts and algorithms - Ability to handle large data sets efficiently - Experience with data processing and manipulation - Familiarity with data analysis and mining techniques The program should be flexible enough to handle any data set, but the client will provide specific data sets for the freelancer to work with. The freelancer should be able to process and analyze the provided data sets efficiently using the Mapreduce program.

    €100 (Avg Bid)
    €100 Average
    27 bids

    It's a Java Hadoop MapReduce task. The program should run on Windows OS. An algorithm must be devised and implemented that can recognize the language of a given text. Thank you.

    €31 (Avg Bid)
    €31 Average
    8 bids

    Looking for a freelancer to help with a simple Hadoop/Spark task focusing on data visualization. The ideal candidate should have experience in: - Hadoop and Spark - Data visualization tools and techniques - Ability to work quickly and deliver results as soon as possible. The task is: Use the following link to get the dataset: Write a report that contains the following steps: 1. Write the steps of the Spark & Hadoop setup with some screenshots. 2. Import Libraries and Set Work Background (steps + screenshots). 3. Load and Discover Data (steps + screenshots + code). 4. Data Cleaning and Preprocessing (steps + screenshots + code). (A small load-and-clean PySpark sketch follows this listing.) 5. Data Analysis - Simple Analysis (explanation, print screen codes) - Moderate Analysis (explanation

    €31 (Avg Bid)
    €31 Average
    8 bids
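
    A minimal sketch for steps 3 and 4 of the report outline above; the dataset link is not included in the listing, so the file path and the "id" column used for null filtering are placeholders.

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("report-load-clean").getOrCreate()

    # 3. Load and discover the data (assumed CSV with a header).
    df = spark.read.option("header", True).option("inferSchema", True).csv("dataset.csv")
    df.printSchema()

    # 4. Basic cleaning: drop exact duplicates and rows missing a key column.
    clean = df.dropDuplicates().na.drop(subset=["id"])
    print("rows before/after cleaning:", df.count(), clean.count())

    spark.stop()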