Use Azure Data Factory to fetch data from IDoSell REST API and save in Data Lake.

$10-30 CAD

Completed
Posted 5 months ago

Paid on delivery
I am seeking a freelancer with expertise in Azure Data Factory and REST APIs for the specific task of retrieving sales data from the IDoSell e-commerce platform and storing it in a Data Lake. The ideal candidate should have:

- Proficiency in Azure Data Factory and REST APIs
- Experience fetching and processing paginated sales data
- Familiarity with Data Lake storage and management
- Ability to work with JSON format for data storage

The primary objective of this project is to fetch sales data from the IDoSell REST API and save it in a Data Lake using Azure Data Factory, with the data stored in JSON format. The freelancer should be able to establish the connections and workflows needed to automate the data fetching process. API documentation is available at [login to view URL]. Access to a demo IDoSell store will be provided.

Requirements for the freelancer:

- Azure access: I do not provide access to Azure; the selected contractor must be able to provision Azure Data Factory themselves.
- Implementation: Develop a working solution within Azure Data Factory that retrieves all orders from the specified endpoint, iterating through pages using the "resultsPage" and "resultsLimit" parameters. Output should be saved to the Data Lake as one or more JSON files.
- Objective: Provide a detailed explanation, including screenshots, enabling me to recreate the solution in my own environment.
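The pagination loop the brief asks for can be sketched in plain Python before building it in Data Factory. This is a minimal illustration only: the fetch function below is a stub standing in for the IDoSell orders endpoint, and its field names are invented, not the real API contract.

```python
import json

# Stand-in for the IDoSell orders endpoint; in ADF this would be a REST
# dataset call carrying the "resultsPage" and "resultsLimit" query
# parameters. The response shape here is illustrative only.
FAKE_ORDERS = [{"orderId": i} for i in range(1, 26)]  # 25 demo orders

def fetch_orders_page(results_page, results_limit):
    """Return one page of orders, mimicking paginated API behaviour."""
    start = (results_page - 1) * results_limit
    return FAKE_ORDERS[start:start + results_limit]

def fetch_all_orders(results_limit=10):
    """Iterate pages until an empty page signals the end of the data."""
    orders, page = [], 1
    while True:
        batch = fetch_orders_page(page, results_limit)
        if not batch:
            break
        orders.extend(batch)
        page += 1
    return orders

all_orders = fetch_all_orders()
# In the real pipeline this serialized payload would land in the
# Data Lake container as a JSON file.
payload = json.dumps(all_orders)
```

In ADF the same loop is typically expressed as an Until or ForEach activity wrapping a Copy activity, with the page counter held in a pipeline variable.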
Project ID: 37522383

About the project

6 proposals
Remote project
Active 5 mos ago

Awarded to:
Hello,

I am writing to express my interest in your Azure data project. The key responsibilities align perfectly with my expertise and experience in Azure data technologies and data engineering, and I am confident I can contribute effectively to the success of your data platform.

I have a solid track record of 6 years designing and implementing key components and technologies for data platforms. My experience includes building scalable data solutions using Azure services such as Azure Data Factory, Python, Azure Databricks, and Azure Synapse Analytics. I have 10 years of solid experience in ETL/ELT processes and have successfully implemented data pipelines to extract, transform, and load data from various sources, including APIs, into data warehouses and data lakes. I also hold a Microsoft certification as an Azure data engineer, which further validates my expertise in this domain.

I am well-versed in the technologies and tools essential for successful project execution. Whether it is designing and managing databases, writing efficient SQL queries, leveraging Azure services, creating robust ETL pipelines, or developing insightful data visualizations, I am equipped with the skills needed to deliver exceptional results.

Thank you for considering my proposal; I look forward to the possibility of working with you.

Sincerely, Praveen
$30 CAD in 7 days
5.0 (2 reviews)
2.8
6 freelancers are bidding on average $27 CAD for this job
Hello, I hope you are well. After looking over your job description, I think it is entirely doable: using the "ForEach" activity in Azure Data Factory, I will iterate through pages via the "resultsPage" and "resultsLimit" parameters, and I have expertise in data flows into Azure Data Lake. Having worked as an Azure developer for more than ten years, I have completed projects of this nature, and I have expertise in creating REST APIs.

My skills:
- MS Azure services
- Developing, deploying, and configuring Azure Web Apps and Azure App Services using Visual Studio; experience in cloud migration and integration
- Azure Active Directory, ADFS, OAuth, and SAML for authentication/authorization
- Building solutions using Event Hubs, Service Bus, Blob Storage, Azure Files, and Table Storage
- Developing and hosting REST APIs using Azure API Management and Swagger, with integration into Azure web apps
- Designing, hosting, and managing data in SQL Azure, Document DB, MongoDB, Table Storage, Azure Data Lake, and Blob Storage

I am eager to discuss how my skills align with your project's needs and contribute to the success of your requirements. Looking forward to the opportunity for further discussion.

Warm regards, Imtiyaz
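The ForEach pattern this bid alludes to relies on knowing the page numbers up front, so the loop body can fetch each page independently (which is also what lets ADF run the iterations in parallel). A small sketch of the page-range computation, with illustrative names:

```python
import math

def page_numbers(total_results, results_limit):
    """Page indices 1..N that a ForEach activity would iterate over,
    given a known total row count and the page size (resultsLimit)."""
    return list(range(1, math.ceil(total_results / results_limit) + 1))

# e.g. 95 orders at 20 per page -> pages 1 through 5
pages = page_numbers(95, 20)
```

In ADF the equivalent is usually a `range()` expression feeding the ForEach items, seeded by a first lookup call that reports the total count.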
$30 CAD in 7 days
2.4 (2 reviews)
3.2
Greetings, dear client. Welcome to my profile, home to professional, quality services with a 100% customer-satisfaction guarantee. I am a certified and experienced expert in this project's requirements. I have keenly gone through all the requirements in your project description and confirm that I can deliver exactly as instructed. I possess all the clearly stated required skills (Azure, ETL, JSON, and Microsoft Azure), as this is my area of professional specialization, with all certifications completed and adequate experience developed; I therefore humbly ask you to consider my bid for professional, quality, and affordable service. Strict timely delivery and unlimited revisions. Kindly message me so we can discuss the project further and seal the contract. Welcome, and thank you.
$30 CAD in 1 day
0.0 (0 reviews)
0.0
I used to work on the Microsoft ADF team, so I have mastered product usage and pipeline performance tuning. I am familiar with the Azure platform (networking, RBAC) and its services (Blob, SQL, Databricks, etc.), and I am proficient in Python and REST APIs. I understand pagination rules in ADF, including configuring the "resultsPage" and "resultsLimit" parameters, and I can combine ADF with Azure Functions or notebooks to implement data-processing steps that ADF does not support natively.
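Pushing unsupported processing into an Azure Function or notebook, as this bid suggests, typically means a small transform between fetch and landing. A hypothetical example (the order schema and field names here are invented for illustration) would be flattening nested line items into one row each:

```python
def flatten_order(order):
    """Emit one flat row per line item, carrying the order id down.
    The "items"/"sku"/"qty" fields are an illustrative schema, not
    the real IDoSell response shape."""
    return [
        {"orderId": order["orderId"], "sku": item["sku"], "qty": item["qty"]}
        for item in order.get("items", [])
    ]

sample = {"orderId": 7, "items": [{"sku": "A", "qty": 2}, {"sku": "B", "qty": 1}]}
rows = flatten_order(sample)
```

The same function body could run unchanged inside an Azure Function HTTP trigger or a Databricks notebook cell invoked from the pipeline.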
$30 CAD in 7 days
0.0 (0 reviews)
0.0
I have more than 10 years of experience in data engineering, pipeline design, and automation. I recently automated a carbon-footprint sustainability pipeline on Azure, which required fetching from Microsoft's Sustainability OData API using Azure Data Factory; I used parameters such as a date key to filter and limit the data, and handled pagination (fetching the next link automatically) using the Copy Data activity in Data Factory. That is very similar to what you are expecting, and I can quickly implement and explain it to you. Looking forward to your response...
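The "next link" style of pagination this bid describes (common in OData feeds, and expressible via the Copy activity's pagination rules in ADF) differs from page-number paging: each response carries the URL of the next page, and the loop ends when that field is absent. A stubbed sketch, with invented response shapes:

```python
# Simulated OData-style responses keyed by URL; each page points at the
# next page via "nextLink" until the final page omits it. Shapes are
# illustrative, not a real API contract.
PAGES = {
    "/orders": {"value": [1, 2], "nextLink": "/orders?skip=2"},
    "/orders?skip=2": {"value": [3, 4], "nextLink": "/orders?skip=4"},
    "/orders?skip=4": {"value": [5]},
}

def fetch(url):
    """Stand-in for an HTTP GET against the paginated feed."""
    return PAGES[url]

def collect(url):
    """Follow nextLink until it disappears, accumulating all rows."""
    rows = []
    while url:
        resp = fetch(url)
        rows.extend(resp["value"])
        url = resp.get("nextLink")
    return rows
```

ADF's Copy activity can follow such a link automatically when the pagination rule points at the response field holding the next URL, which is why this approach maps well onto the job's requirements.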
$30 CAD in 5 days
0.0 (0 reviews)
0.0

About the client

Flag of POLAND
Otwock, Poland
5.0
1
Payment method verified
Member since Oct 1, 2020

Client Verification
