Description:
Required Qualifications:
- All applicants authorized to work in the United States are encouraged to apply
- At least 4 years of experience in Information Technology
- At least 2 years of experience with PySpark
- Strong understanding of distributed computing principles and big data technologies
- At least 2 years of experience working with Apache Spark and Spark SQL
- Knowledge of data serialization formats such as Parquet, Avro, or ORC
- Familiarity with data processing and transformation te
Aug 1, 2025
from: dice.com