Java Spark Developer Jobs
By Tanisha Systems, Inc At Plano, TX, United States
Required qualifications, capabilities, and skills:
Formal training or certification on software engineering concepts and 5+ years applied experience.
Hands-on practical experience delivering system design, application development, testing, and operational stability.
Advanced proficiency in one or more programming languages - Java 8+ (experience with lambdas and streams)
Hands-on experience in Spark leveraging Java 8
In-depth knowledge of the financial services industry and its IT systems
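The "lambdas and streams" requirement above refers to Java 8's functional-style collection processing. A minimal sketch of the idiom (the class and sample data are illustrative, not from the posting):

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class StreamsDemo {
    public static void main(String[] args) {
        List<String> trades = Arrays.asList("BUY:100", "SELL:40", "BUY:25");

        // Lambdas passed through a stream pipeline: split each record,
        // group the amounts by side, and sum them per side.
        Map<String, Integer> totals = trades.stream()
            .map(t -> t.split(":"))
            .collect(Collectors.groupingBy(
                parts -> parts[0],
                Collectors.summingInt(parts -> Integer.parseInt(parts[1]))));

        System.out.println(totals.get("BUY"));   // 125
        System.out.println(totals.get("SELL"));  // 40
    }
}
```

The same lambda style carries directly over to Spark's Java API, which is why the two requirements appear together.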
Spark Lead Jobs
By NLB Services At Irving, TX, United States
· Strong hands-on experience with shell/Python scripting and a good understanding of Linux/Unix.
· Strong understanding of SRE concepts and practices, with automation implementation.
· Hands-on experience with Jenkins, RLM, and Ansible.
· Good to have: knowledge of Sensu.
· Good to have: knowledge and understanding of Kubernetes (K8s).
Location: Irving, TX (day-one onsite role)

Are you looking for an exciting opportunity to work with Apache Spark and help shape the future of big data analytics? We are looking for a talented engineer to join our team and help us build the next generation of data processing and analytics solutions. If you have a passion for working with large datasets and a desire to make a real impact, then this is the job for you!

Overview:

Apache Spark is an open-source distributed computing framework for big data processing, analytics, and machine learning. It provides a unified platform for data processing and analytics, enabling developers to quickly build applications that scale to large datasets.

Detailed Job Description:

An Apache Spark job typically involves developing and deploying applications that use the Spark framework. This includes writing code in languages such as Scala, Java, and Python, as well as configuring and managing the Spark cluster. The job may also involve developing and deploying applications that use other big data technologies such as Hadoop, Cassandra, and Kafka.
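The code such a job involves is typically transformation logic wired into Spark's RDD API as lambdas. As a sketch, the word-count logic below is written as a plain stdlib function so it runs locally without a cluster or the spark-core dependency; the equivalent Spark calls are shown in comments (an assumed, simplified pipeline, not a drop-in program):

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class WordCountLogic {
    // In an actual Spark job, the same per-record lambdas would be passed
    // to RDD transformations, roughly:
    //   sc.textFile("hdfs://...")                                   // JavaRDD<String>
    //     .flatMap(line -> Arrays.asList(line.split(" ")).iterator())
    //     .mapToPair(word -> new Tuple2<>(word, 1))
    //     .reduceByKey(Integer::sum);
    // Here the logic runs over an in-memory List instead of an RDD.
    static Map<String, Long> wordCount(List<String> lines) {
        return lines.stream()
            .flatMap(line -> Arrays.stream(line.split(" ")))
            .collect(Collectors.groupingBy(w -> w, Collectors.counting()));
    }

    public static void main(String[] args) {
        Map<String, Long> counts =
            wordCount(Arrays.asList("spark java spark", "java"));
        System.out.println(counts.get("spark")); // 2
        System.out.println(counts.get("java"));  // 2
    }
}
```

Configuring and managing the cluster itself (executors, memory, partitioning) is the other half of the job and is not shown here.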

What Skills Are Required for an Apache Spark Job?

• Proficiency in programming languages such as Scala, Java, and Python
• Knowledge of distributed computing frameworks such as Apache Spark and Hadoop
• Knowledge of big data technologies such as Cassandra, Kafka, and Elasticsearch
• Experience with data processing and analytics
• Ability to debug and troubleshoot applications
• Knowledge of cloud computing platforms such as AWS and Azure

What Are the Qualifications for an Apache Spark Job?

• Bachelor’s degree in computer science, engineering, or a related field
• Experience with distributed computing frameworks such as Apache Spark and Hadoop
• Experience with big data technologies such as Cassandra, Kafka, and Elasticsearch
• Knowledge of programming languages such as Scala, Java, and Python
• Knowledge of cloud computing platforms such as AWS and Azure

What Knowledge Does an Apache Spark Job Require?

• Knowledge of distributed computing frameworks such as Apache Spark and Hadoop
• Knowledge of big data technologies such as Cassandra, Kafka, and Elasticsearch
• Knowledge of programming languages such as Scala, Java, and Python
• Knowledge of cloud computing platforms such as AWS and Azure

What Experience Does an Apache Spark Job Require?

• Experience with distributed computing frameworks such as Apache Spark and Hadoop
• Experience with big data technologies such as Cassandra, Kafka, and Elasticsearch
• Experience with data processing and analytics
• Experience with programming languages such as Scala, Java, and Python
• Experience with cloud computing platforms such as AWS and Azure

What Are the Responsibilities of an Apache Spark Job?

• Develop and deploy applications using the Spark framework
• Configure and manage the Spark cluster
• Develop and deploy applications using other big data technologies such as Hadoop, Cassandra, and Kafka
• Debug and troubleshoot applications
• Monitor and optimize application performance
• Collaborate with other teams to ensure successful deployment