PySpark is the Python API for Apache Spark, making it easy to write applications that process data at scale. Using PySpark, you can write richer and more powerful data processing programs with the Python skills you already have.

Hire PySpark Experts

    4 jobs found, pricing in GBP

    It’s PySpark code; we need to optimise it.

    £14 / hr (Avg Bid)
    6 bids

    There are a few R scripts that need to be converted to PySpark. The work is almost done; some logic still needs to be implemented and tested.

    £145 (Avg Bid)
    12 bids

    Expert-level knowledge of AWS services such as EMR and S3. Extensive experience with Python, PySpark, Hadoop, Hue, Presto, and bash shell scripting. Expert in Apache Airflow, including creating and troubleshooting DAGs. Good troubleshooting skills. Good experience with Talend. Experience with Vertica.

    £575 (Avg Bid)
    6 bids

    Looking for an Azure Data Engineer from India for our Enterprise Client (INDIVIDUALS ONLY). Teams/Enterprises/Consultancies/Agencies please stay away.
    Project Duration: 3-6 months. Location: Remote/WFH. Hours Required: 40 hrs/week.
    Responsibilities:
    • Design Azure Data Lake solution, partition strategy for files
    • Explore and load data from structured and semi-structured data sources into ADLS and Azure Blob Storage
    • Ingest and transform data using Azure Data Factory
    • Ingest and transform data using Azure Databricks and PySpark
    • Design and build data engineering pipelines using Azure Data Factory
    • Implement data pipelines for full load and incremental data loads
    • Design and build error handling, data quality routines using Data factory a...

    £20 / hr (Avg Bid)
    20 bids