II. SKILLS AND EXPERIENCE REQUIRED:
- 7-10+ years of technical experience designing, configuring, and building data-centric software solutions
- Advanced knowledge of Java 8+, including multithreading, collections, the Streams API, and functional programming, gained on real enterprise projects
- Minimum of one year of working experience developing cloud-native streaming applications using Kafka, Kafka Streams, and the Spring Framework (see the illustrative sketch after this list)
- Hands-on experience with high-speed distributed computing frameworks such as AWS EMR, Hadoop, HDFS, S3, MapReduce, Apache Spark, Apache Hive, Kafka Streams, Apache Flink, etc.
- Some hands-on experience with a distributed message broker such as Kafka, RabbitMQ, ActiveMQ, Amazon Kinesis, etc.
- Hands-on experience with foundational AWS services such as VPCs, security groups, EC2, RDS, S3 ACLs, KMS, the AWS CLI, and IAM, as well as experience with Big Data architectures and BI solutions
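For context, the Kafka Streams work described above typically resembles the minimal sketch below. It uses the plain Kafka Streams DSL in a Java 8 functional style rather than Spring's abstractions; the application id, broker address, topic names ("orders", "large-orders"), and the filter threshold are illustrative assumptions only.

import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class LargeOrderFilter {
    public static void main(String[] args) {
        // Basic configuration; values here are placeholders for a real deployment.
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "large-order-filter");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> orders = builder.stream("orders");

        // Functional-style transformation: keep only orders whose numeric value exceeds 1000.
        orders.filter((key, value) -> Double.parseDouble(value) > 1000.0)
              .to("large-orders");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();

        // Close the topology cleanly on JVM shutdown.
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}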
The following are pluses:
- Intermediate working knowledge of DevOps tools such as Terraform, Ansible, Jenkins, Maven/Gradle, Nexus/Artifactory, and CI/CD pipelines
- Strong debugging and troubleshooting skills, resourcefulness, and research skills
- Demonstrated proficiency in both oral and written business communication
Please let us know if you have any questions.