pyspark.sql.functions.round

pyspark.sql.functions.round(col, scale=None)

Round the given value to scale decimal places using HALF_UP rounding mode if scale >= 0, or round at the integral part when scale < 0.

New in version 1.5.0.

Changed in version 3.4.0: Supports Spark Connect.

Parameters
col : Column or str

The target column or column name to compute the round on.

scale : Column or int, optional

An optional parameter to control the rounding behavior: the number of decimal places to round to (or, if negative, the number of digits to the left of the decimal point). Defaults to 0 when omitted.

Changed in version 4.0.0: Support Column type.

Returns
Column

A column for the rounded value.

Examples

Example 1: Compute the rounded value of a column

>>> import pyspark.sql.functions as sf
>>> spark.range(1).select(sf.round(sf.lit(2.5))).show()
+-------------+
|round(2.5, 0)|
+-------------+
|          3.0|
+-------------+

Example 2: Compute the rounded value of a column with a specified scale

>>> import pyspark.sql.functions as sf
>>> spark.range(1).select(sf.round(sf.lit(2.1267), sf.lit(2))).show()
+----------------+
|round(2.1267, 2)|
+----------------+
|            2.13|
+----------------+
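
Example 3: Compute the rounded value of a column with a negative scale (a minimal sketch, not from the official examples; the literal value and output shown assume HALF_UP rounding at the integral part, as described above)

>>> import pyspark.sql.functions as sf
>>> spark.range(1).select(sf.round(sf.lit(1234.5678), sf.lit(-2))).show()
+--------------------+
|round(1234.5678, -2)|
+--------------------+
|              1200.0|
+--------------------+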