Cloudera optimization for HDFS/Spark environment -- 2

Closed · Posted 7 years ago · Paid on delivery

Need to optimize performance by configuring server resource utilization (memory and CPU) across the nodes of a recently installed environment running Cloudera 5.5.
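For context, the kind of tuning such a project typically involves is setting the YARN container limits and Spark executor sizing to match the node hardware. The values below are purely illustrative assumptions (e.g. a worker node with 64 GB RAM and 16 cores, leaving headroom for the OS and Cloudera agents); the actual figures depend on the cluster:

```
# yarn-site.xml (per NodeManager) -- assumed 64 GB / 16-core nodes
yarn.nodemanager.resource.memory-mb      = 57344   # ~56 GB usable for containers
yarn.nodemanager.resource.cpu-vcores     = 14      # reserve 2 cores for OS/daemons
yarn.scheduler.maximum-allocation-mb     = 57344
yarn.scheduler.minimum-allocation-mb     = 1024

# spark-defaults.conf (Spark on YARN, CDH 5.5 era)
spark.executor.memory                    = 8g
spark.executor.cores                     = 4
spark.yarn.executor.memoryOverhead       = 1024    # off-heap overhead per executor, in MB
```

The general rule of thumb is that executor memory plus its overhead must fit inside a YARN container, and the per-node totals (executors × cores, executors × memory) must stay under the NodeManager limits above.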

Hadoop Hive Spark Yarn

Project ID: #10626684

About the project

7 proposals · Remote project · Active 7 years ago

7 freelancers are bidding an average of $422 for this job

winnow1

We are a group of Data Scientists based in Bangalore. Our core areas of expertise are big data and machine learning. We can assist you with Hadoop configuration and cluster optimisation, and later with implementing complex More

$775 USD in 5 days
(4 reviews)
4.9
mahendrasinghmar

I will do it. I think you posted this earlier as well; at that time I was busy with another client, but now I am free, so let's do it.

$250 USD in 3 days
(1 review)
1.0
ITLove007

Hello, I have experience with HDFS. I've built a search system using Solr, HDFS (Hadoop), and Nutch: Solr for indexing and searching, Hadoop for HDFS and MapReduce, and Nutch for crawling.

$736 USD in 5 days
(0 reviews)
0.0
ankushkulkarni

A proposal has not yet been provided

$388 USD in 15 days
(0 reviews)
0.0
ajithkumarkm0

I am a Cloudera and MapR certified Hadoop administrator with good knowledge of Linux. Please let me know how to proceed further on this. I have experience in Hadoop cluster security, optimization, and sizing More

$277 USD in 3 days
(0 reviews)
0.0