Spark Engineer Jobs
By TanPro ITCorp, Remote
Data format experience: Parquet, CSV, etc.
Azure and GCP knowledge is a plus
Big data: Hadoop, Spark, PySpark
Hands-on programming: Java, Scala, Python
AWS cloud: S3, EFS, MSK, ECS, EMR
Distributed computing constructs: joins
Spark Software Developer- Remote
By Florida Blue, Remote, $72,300 - $117,500 a year
Experience writing complex SQL queries and data warehousing knowledge
3+ years software development coding experience
Hands-on coding experience in Spark
Experience working with different file formats, including XML, JSON, and Parquet.
Experience developing code to process large-volume data in batch and real time using Spark Streaming.
Experience building data flow and process flow diagrams

Are you an experienced Spark developer looking for a remote job that allows you to work from anywhere? We are looking for a talented individual to join our team and help us develop cutting-edge applications using Apache Spark. You will have the opportunity to work with the latest technologies and collaborate with a diverse team of professionals. If you are passionate about data engineering and have a knack for problem solving, this could be the perfect job for you!

Overview:

A Remote Spark Developer is a software engineer who specializes in building applications on the Apache Spark framework. They design, develop, and maintain Spark-based applications and must be familiar with the framework's core components, such as Spark Core, Spark SQL, and Spark Streaming, as well as related technologies like Hadoop, Kafka, and Cassandra.

Detailed Job Description:

A Remote Spark Developer is responsible for designing, developing, and maintaining applications built on the Apache Spark framework. The role requires working with Spark's core components (Spark Core, Spark SQL, and Spark Streaming) and with related technologies such as Hadoop, Kafka, and Cassandra. Developers write code in Java, Scala, or Python, debug and troubleshoot applications, and work with distributed systems, so a solid grasp of distributed computing principles is essential.

What Skills Does a Remote Spark Developer Job Require?

• Proficiency in Java, Scala, and Python
• Knowledge of the Apache Spark framework
• Knowledge of Hadoop, Kafka, and Cassandra
• Knowledge of distributed systems and distributed computing
• Ability to debug and troubleshoot applications
• Ability to work independently and as part of a team

What Qualifications Does a Remote Spark Developer Job Require?

• Bachelor’s degree in Computer Science or related field
• 5+ years of experience in software development
• Experience with distributed systems and distributed computing
• Experience with Apache Spark
• Experience with Hadoop, Kafka, and Cassandra

What Knowledge Does a Remote Spark Developer Job Require?

• Knowledge of the Apache Spark framework
• Knowledge of Hadoop, Kafka, and Cassandra
• Knowledge of distributed systems and distributed computing
• Knowledge of Java, Scala, and Python

What Experience Does a Remote Spark Developer Job Require?

• 5+ years of experience in software development
• Experience with distributed systems and distributed computing
• Experience with Apache Spark
• Experience with Hadoop, Kafka, and Cassandra

What Are the Responsibilities of a Remote Spark Developer?

• Design, develop, and maintain applications built on the Apache Spark framework
• Work with the various components of the Spark framework
• Work with other technologies such as Hadoop, Kafka, and Cassandra
• Write code in Java, Scala, and Python
• Debug and troubleshoot applications
• Work with distributed systems and understand the principles of distributed computing