Raised to power of column in pyspark – square, cube, square root and cube root in pyspark

Raising a column to a power in pyspark can be accomplished using the pow() function, which takes the column name and the numeric value of the exponent as arguments. With the help of pow() we can find the square, the cube, the square root and the cube root of a column in pyspark. We will see each of these with an example:

  • Square of the column in pyspark with example
  • Cube of the column in pyspark with example
  • Raised to power N of the column in pyspark with example
  • Cube root of the column in pyspark with example
  • Square root of the column in pyspark with example

Syntax:

 pow(col1, n)

col1 – column name
n – the power to which the column is raised
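
The exponent does not have to be a literal; pow() also accepts a second column, so each row can be raised to its own per-row power. A minimal sketch, assuming a dataframe with hypothetical columns base_col and exp_col:

## exponent taken from another column (hypothetical column names)
from pyspark.sql.functions import pow, col

df.select("*", pow(col("base_col"), col("exp_col")).alias("powered")).show()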

We will be using a dataframe df that contains a mathematics_score column throughout the examples below.
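
Here is a minimal sketch that builds such a dataframe; the name column and the score values 4, 9 and 27 are assumptions, and only mathematics_score matters for the examples:

## sample dataframe – hypothetical data, only mathematics_score is required
from pyspark.sql import SparkSession, Row

spark = SparkSession.builder.appName("pow_examples").getOrCreate()

df = spark.createDataFrame([
    Row(name="Alice", mathematics_score=4),
    Row(name="Bob", mathematics_score=9),
    Row(name="Carol", mathematics_score=27),
])
df.show()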


Square of the column in pyspark with example:

The pow() function takes the column name and 2 as arguments, which calculates the square of the column in pyspark.


## square of the column in pyspark
from pyspark.sql.functions import pow, col

df.select("*", pow(col("mathematics_score"), 2).alias("Math_score_square")).show()

In our example the square of “mathematics_score” is calculated as shown below.
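
With the hypothetical dataframe built above (scores 4, 9 and 27), the output would look like the following; note that pow() returns a double, so the squares print as 16.0, 81.0 and 729.0:

+-----+-----------------+-----------------+
| name|mathematics_score|Math_score_square|
+-----+-----------------+-----------------+
|Alice|                4|             16.0|
|  Bob|                9|             81.0|
|Carol|               27|            729.0|
+-----+-----------------+-----------------+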


Cube of the column in pyspark with example:

The pow() function takes the column name and 3 as arguments, which calculates the cube of the column in pyspark.


## cube of the column in pyspark
from pyspark.sql.functions import pow, col

df.select("*", pow(col("mathematics_score"), 3).alias("Math_score_cube")).show()

In our example the cube of “mathematics_score” is calculated as shown below.
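
With the same hypothetical data, the cubes come out as 64.0, 729.0 and 19683.0:

+-----+-----------------+---------------+
| name|mathematics_score|Math_score_cube|
+-----+-----------------+---------------+
|Alice|                4|           64.0|
|  Bob|                9|          729.0|
|Carol|               27|        19683.0|
+-----+-----------------+---------------+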


Raised to power N of the column in pyspark with example:

The pow() function takes the column name and N as arguments, which calculates the Nth power of the column in pyspark. In the example below, N = 4.


## Power of N to the column in pyspark
from pyspark.sql.functions import pow, col

df.select("*", pow(col("mathematics_score"), 4).alias("Math_score_power")).show()

In our example the column “mathematics_score” is raised to the power of 4, as shown below.
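
With the hypothetical data, raising to the power of 4 gives 256.0, 6561.0 and 531441.0:

+-----+-----------------+----------------+
| name|mathematics_score|Math_score_power|
+-----+-----------------+----------------+
|Alice|                4|           256.0|
|  Bob|                9|          6561.0|
|Carol|               27|        531441.0|
+-----+-----------------+----------------+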


Cube root of the column in pyspark with example:

The pow() function takes the column name and 1/3 as arguments, which calculates the cube root of the column in pyspark.

## cube root of the column in pyspark

from pyspark.sql.functions import pow, col

df.select("*", pow(col("mathematics_score"), 1/3).alias("Math_score_cuberoot")).show()

In our example the cube root of “mathematics_score” is calculated as shown below.
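
With the hypothetical scores 4, 9 and 27, the Math_score_cuberoot column holds roughly 1.587, 2.080 and 3.0. Because 1/3 is not exactly representable as a double, even a perfect cube can land a hair away from a whole number; pyspark.sql.functions also provides a dedicated cbrt() function, sketched below:

## cube root via the built-in cbrt() function
from pyspark.sql.functions import cbrt, col

df.select("*", cbrt(col("mathematics_score")).alias("Math_score_cuberoot")).show()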


Square root of the column in pyspark with example:

The pow() function takes the column name and 1/2 as arguments, which calculates the square root of the column in pyspark.

## square root of the column in pyspark
from pyspark.sql.functions import pow, col

df.select("*", pow(col("mathematics_score"), 1/2).alias("Math_score_squareroot")).show()

In our example the square root of “mathematics_score” is calculated as shown below.
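
With the hypothetical data, the square roots are 2.0, 3.0 and roughly 5.196. pyspark.sql.functions also provides sqrt() directly, which reads a little clearer than pow() with 1/2; a minimal sketch:

## square root via the built-in sqrt() function
from pyspark.sql.functions import sqrt, col

df.select("*", sqrt(col("mathematics_score")).alias("Math_score_squareroot")).show()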

Author

  • Sridhar Venkatachalam

    With close to 10 years of experience in data science and machine learning, Sridhar has worked extensively with programming languages like R, Python (Pandas), SAS and Pyspark.