How to Concatenate Two PySpark DataFrames Efficiently?

Concatenating DataFrames is a common task in data processing pipelines. In PySpark, the `union` method handles this efficiently by stacking the rows of one DataFrame on top of another. Below is a short explanation along with a code snippet demonstrating the process.

Concatenating Two PySpark DataFrames

In PySpark, the `union` method concatenates two DataFrames row-wise. It resolves columns by position, so both DataFrames must have the same number of columns with compatible types in the same order. If the column order or names differ, align them first with `select` (and `cast` where needed), or use `unionByName` to match columns by name, as shown in the sketch below.
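If the columns are the same but appear in a different order, `unionByName` (Spark 2.3+) matches them by name instead of position; with `allowMissingColumns=True` (Spark 3.1+) it also tolerates columns present in only one DataFrame, filling the gaps with nulls. Here is a minimal sketch, assuming a local Spark session; the column layout is made up for illustration:

from pyspark.sql import SparkSession

# Create (or reuse) a Spark session
spark = SparkSession.builder.appName("UnionByNameExample").getOrCreate()

# Same columns, different order
df_a = spark.createDataFrame([("John", 28)], ["Name", "Age"])
df_b = spark.createDataFrame([(35, "Mike")], ["Age", "Name"])

# unionByName aligns columns by name, so the rows line up correctly
df_a.unionByName(df_b).show()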

Step-by-Step Guide

1. **Create the DataFrames**: Start by creating two DataFrames with the same schema.
2. **Union the DataFrames**: Use the `union` method to concatenate them.
3. **Show the Result**: Display the concatenated DataFrame.

Example

Here is a code example using PySpark:


from pyspark.sql import SparkSession

# Create Spark session
spark = SparkSession.builder.appName("ConcatDataFrames").getOrCreate()

# Create DataFrame 1
data1 = [("John", 28), ("Anna", 23)]
columns1 = ["Name", "Age"]
df1 = spark.createDataFrame(data1, columns1)

# Create DataFrame 2
data2 = [("Mike", 35), ("Sara", 29)]
columns2 = ["Name", "Age"]
df2 = spark.createDataFrame(data2, columns2)

# Concatenate DataFrames using union
df_concatenated = df1.union(df2)

# Show the concatenated DataFrame
df_concatenated.show()

Output


+----+---+
|Name|Age|
+----+---+
|John| 28|
|Anna| 23|
|Mike| 35|
|Sara| 29|
+----+---+
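If you need to concatenate more than two DataFrames, you can fold `union` over a list with `functools.reduce`. A minimal sketch, continuing from the example above; the third DataFrame `df3` is made up for illustration:

from functools import reduce
from pyspark.sql import DataFrame

# A hypothetical third DataFrame with the same schema
df3 = spark.createDataFrame([("Liam", 41)], ["Name", "Age"])

# Fold union over the list: ((df1 union df2) union df3)
df_all = reduce(DataFrame.union, [df1, df2, df3])
df_all.show()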

Important Considerations

1. **Schema Matching**: `union` matches columns by position, so both DataFrames need the same number of columns with compatible types. Use `unionByName` when you need to match by column name. Also note that `union` keeps duplicate rows (it behaves like SQL `UNION ALL`); call `distinct()` on the result if you need set semantics.
2. **Performance**: `union` is a narrow transformation that requires no shuffle; the result simply carries the partitions of both inputs. Chaining many unions can therefore leave you with lots of small partitions, so consider a `coalesce` or `repartition` before expensive downstream steps, as shown in the sketch after this list.
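A rough sketch of checking and reducing the partition count after a union, continuing from the example above; the exact partition numbers depend on your Spark configuration and data:

# The union carries over the partitions of both inputs
print(df1.rdd.getNumPartitions(), df2.rdd.getNumPartitions(), df_concatenated.rdd.getNumPartitions())

# Reduce the number of (possibly small) partitions before further processing;
# 4 is an arbitrary target chosen for illustration
df_compacted = df_concatenated.coalesce(4)
print(df_compacted.rdd.getNumPartitions())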

By following the above steps, you can efficiently concatenate two PySpark DataFrames.
