As a Big Data & Java Full Stack Developer, you will be responsible for the following:
- Perform data ingestion and write MapReduce jobs.
- Design, develop, enhance, debug and implement J2EE, Angular, Spring based applications.
- Perform application requirement analysis and estimation of new requirements.
- Address problems of system integration, compatibility, and multiple platforms and defects encountered in System Testing and UAT.
- Work with the Project Manager/Business Analyst and the client to gather requirements for user stories.
- Develop and deliver artifacts following Agile methodology.
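For context on the MapReduce work listed above: a job reduces to a map phase (emit key/value pairs per input record) and a reduce phase (aggregate values per key). A minimal sketch of that pattern in plain Java, using streams rather than the actual Hadoop API so no cluster is assumed, might look like:

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class WordCountSketch {
    // Map phase: split each line into words (the "keys").
    // Reduce phase: group by word and sum the counts per key,
    // mirroring what a Hadoop reducer does after the shuffle.
    static Map<String, Long> wordCount(List<String> lines) {
        return lines.stream()
                .flatMap(line -> Arrays.stream(line.toLowerCase().split("\\W+")))
                .filter(w -> !w.isEmpty())
                .collect(Collectors.groupingBy(w -> w, Collectors.counting()));
    }

    public static void main(String[] args) {
        Map<String, Long> counts = wordCount(List.of(
                "Hadoop splits input across mappers",
                "reducers aggregate per key across mappers"));
        System.out.println(counts);
    }
}
```

In a real Hadoop job the same logic lives in `Mapper` and `Reducer` subclasses, and the framework handles the shuffle and distribution across the cluster.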
Technical Skills Required:
- Experience with the Hadoop ecosystem and Big Data technologies using Java/J2EE
- Knowledge of design strategies for building a scalable, resilient data lake: data storage, partitioning, splitting, and file types (Parquet, Avro, ORC).
- Hadoop ecosystem - HDFS, MapReduce, HBase, Hive, Pig, Sqoop
- Strong knowledge of SQL, MySQL.
- Front-end technologies - HTML/CSS, jQuery, JavaScript, Angular, Bootstrap
- Back-end technologies - JPA or Hibernate.
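The data-lake partitioning mentioned above is usually realized as a directory layout: each partition key becomes a `key=value` path segment, so query engines can prune files by path. A small illustrative sketch (the `events` table and the `dt`/`country` keys here are hypothetical, modeled on the common Hive-style convention):

```java
import java.time.LocalDate;

public class PartitionPath {
    // Hive-style partition layout: each partition key becomes a
    // key=value directory level under the table root, e.g.
    // events/dt=2024-01-15/country=US/part-00000.parquet
    static String partitionPath(String table, LocalDate date, String country) {
        return String.format("%s/dt=%s/country=%s", table, date, country);
    }

    public static void main(String[] args) {
        System.out.println(partitionPath("events", LocalDate.of(2024, 1, 15), "US"));
    }
}
```

Choosing partition keys with moderate cardinality (date, region) keeps file counts manageable while still letting engines like Hive or Impala skip irrelevant partitions.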
Nice-to-Have Skills:
- One or more data ingestion frameworks such as Kafka, Storm, NiFi.
- Knowledge of Impala, MongoDB.
- Knowledge of Scala, Python.
- Public cloud (AWS/Azure/GCP) Hadoop cluster experience.
- Google BigQuery, Cloud Dataflow/Apache Beam