
Run Scrapy on AWS

$30-250 USD

Completed
Posted about 3 years ago

Paid on delivery
Hi, I'm trying to scrape a website and save the scraped data to my AWS S3 bucket. I have built a spider that works on my local machine, but I want to deploy multiple spiders on AWS and run them at the same time. The spider needs to read its start URLs from a CSV file and then save the data to my S3 bucket, which will eventually be migrated to Snowflake; an S3 bucket is my current solution, but other solutions would also be fine. It would be great if you could not only provide a solution but also walk me through it so I can understand the mechanics, as I'm fairly new to both Scrapy and AWS. Here is my current script for your reference: [login to view URL], and below is the website I'm trying to scrape: [login to view URL]
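For orientation, here is a minimal sketch of the kind of spider described above, assuming a CSV file named start_urls.csv with a url column and a bucket named my-bucket (both placeholders, not taken from the actual script linked above). Scrapy's feed exports can write directly to S3 when botocore is installed and AWS credentials are available to the process (environment variables, ~/.aws/credentials, or an EC2 instance role):

```python
# Minimal sketch: start URLs come from a CSV file, items go to S3 via feed exports.
# File name, column name, bucket, and extracted fields are all placeholders.
import csv
import scrapy


class CsvSeededSpider(scrapy.Spider):
    name = "csv_seeded"

    # Feed export straight to S3; requires botocore and AWS credentials.
    custom_settings = {
        "FEEDS": {
            "s3://my-bucket/scrapes/%(name)s-%(time)s.csv": {"format": "csv"},
        },
    }

    def start_requests(self):
        # start_urls.csv is assumed to have a column named "url".
        with open("start_urls.csv", newline="") as f:
            for row in csv.DictReader(f):
                yield scrapy.Request(row["url"], callback=self.parse)

    def parse(self, response):
        # Placeholder extraction; the real selectors depend on the target site.
        yield {
            "url": response.url,
            "title": response.css("title::text").get(),
        }
```

Several such spiders can run at the same time simply by starting each one as its own `scrapy crawl` process on the EC2 instance, for example from cron or a small shell script.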
Project ID: 29839980

About the project

6 proposals
Remote project
Active 3 years ago

Awarded to:
Hi there, my name is Pratik Domadiya and I am a professional web scraper. I have been working as a web scraper for about 4+ years and would love to discuss the project further. I have done similar projects before, so it will not take much time. I have set up cron jobs on an AWS server for my own scraping projects, so don't worry if that is what you want; I will also teach you how it works so you can do it yourself in the future. My past work includes the following projects:
1. Indeed job data scraping
2. WhatsApp automation using Selenium
3. Food delivery website (DoorDash) data scraping
4. Restaurant data scraping from TripAdvisor
5. Amazon listing data scraping
6. NSE (National Stock Exchange) data scraping
7. Retail website data scraping: StarQuik (within 20 minutes) and [login to view URL] product data scraping
8. Netmeds (medicine website) data scraping, and many more
9. TradingView stock list data scraping
Deliverables in Excel, CSV, JSON, SQL, Google Sheets, or any format you like. I have also been developing automation code using Python, Selenium WebDriver, Beautiful Soup 4, and Scrapy, which are the fastest options right now. I can show you samples if you want. I have already worked on several projects for local clients in my city. I'd love the opportunity to work with you on this. Please contact me soon so that we can start early. Thank you for taking the time to read my application. Kind regards, Pratik Domadiya
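As a hedged illustration of the cron-based scheduling mentioned in this bid (not the bidder's actual setup): on an EC2 instance the usual pattern is a small runner script invoked from crontab. The paths, schedule, and spider name below are placeholders.

```python
# run_spider.py - a cron-friendly runner (placeholder paths and names).
# Example crontab entry, running the crawl every day at 02:00:
#   0 2 * * * cd /home/ubuntu/myproject && /usr/bin/python3 run_spider.py >> /home/ubuntu/cron.log 2>&1
from scrapy.crawler import CrawlerProcess
from scrapy.utils.project import get_project_settings

if __name__ == "__main__":
    # Assumes this script lives inside the Scrapy project so that
    # get_project_settings() can find scrapy.cfg / the settings module.
    process = CrawlerProcess(get_project_settings())
    process.crawl("csv_seeded")  # spider name from the sketch above (placeholder)
    process.start()  # blocks until the crawl finishes
```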
$100 USD in 2 days
5.0 (3 reviews)
2.6
6 freelancers are bidding an average of $129 USD for this job
Hello there, would it be all right if I create a new script to do the scraping you are looking for? Thanks.
$222 USD in 4 days
4.9 (173 reviews)
7.7
Hi there! I can help you with Scrapy and AWS, and I can also help you connect to the S3 bucket from the AWS instance. I can provide a full walkthrough from start to finish, and you can record all of my activity.
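As a rough illustration of "connecting to the S3 bucket from the AWS instance", a minimal boto3 sketch follows; the bucket, key, and file names are placeholders. On an EC2 instance with an IAM role that allows s3:PutObject, boto3 picks up credentials automatically, so nothing needs to be hard-coded:

```python
# Minimal boto3 upload sketch (placeholder names, assumed IAM instance role).
import boto3


def upload_results(local_path="items.csv",
                   bucket="my-scrape-bucket",
                   key="scrapes/items.csv"):
    # Copies the local results file to s3://<bucket>/<key>.
    s3 = boto3.client("s3")
    s3.upload_file(local_path, bucket, key)


if __name__ == "__main__":
    upload_results()
```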
$140 USD in 1 day
5.0 (30 reviews)
5.2
Hi, I have worked on such a project. I can integrate Scrapy with Django (a web framework), from which you can schedule the scraper and save the data to a database or in any other format. Django lets you trigger the Scrapy spider externally, and you can run it that way on AWS. I have done such a project before and can do it quickly. Thank you.
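For context, one common way to trigger Scrapy "externally" from a Django app is to go through scrapyd's HTTP API rather than embedding the crawler in the web process; the sketch below assumes a scrapyd daemon on localhost:6800 with a deployed project named myproject and a spider named csv_seeded (all placeholder names, not necessarily this bidder's design):

```python
# Hedged sketch: a Django view that asks scrapyd to queue a crawl job.
import requests
from django.http import JsonResponse


def start_scrape(request):
    # scrapyd's schedule.json endpoint queues the job and returns its job id.
    resp = requests.post(
        "http://localhost:6800/schedule.json",
        data={"project": "myproject", "spider": "csv_seeded"},
        timeout=10,
    )
    return JsonResponse(resp.json())
```

Wiring this view into urls.py, or calling the same endpoint from a management command or cron, is what makes the scraper schedulable from outside Scrapy itself.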
$60 USD in 2 days
5.0 (10 reviews)
2.9
Hello, I have gone through your requirements regarding scraping and can definitely help, as I have 6+ years of experience with AWS and Python. I have expertise in various AWS tools and services, including Redshift, Lambda, EC2, S3, SageMaker, Glue, DynamoDB, API Gateway, DMS, etc. Types of projects I have worked on: batch and stream data pipelines in the cloud, cloud migration, solution accelerators, frameworks to automate customers' data pipelines, a stream data generator, ILM, an e-commerce analysis platform, a healthcare platform, SageMaker MLOps, a digital data platform, a banking one-stop-shop accelerator, batch job processing in the cloud, a Kinesis stream processor, database migration using DMS, etc. Awaiting your response! Thanks.
$111 USD in 1 day
0.0 (0 reviews)
0.0
Hi, I already have experience with this kind of functionality and can easily implement a solution to your problem.
$140 USD in 7 days
0.0 (0 reviews)
0.0

About this client

McLean, United States
0.0
0
Payment method verified
Member since Apr 11, 2021

Client verification
