Big Data, Hive, Scala, Spark, HDFS, GCP with Java knowledge
The Big Data developer provides care and feeding of our Big Data environments and their interfaces, built upon technologies in the Hadoop ecosystem including Hive, HBase, Spark, and Kafka.
Collaborate with like-minded team members to establish best practices and identify optimal technical solutions (20%)
Review code and provide feedback on best practices and performance improvements
Design, develop, and test large-scale, custom distributed software systems using the latest Java, Scala, and Big Data technologies
Adhere to the appropriate SDLC and Agile practices
Actively contribute to defining the technology strategy (design, architecture, and interfaces) in order to respond effectively to our clients' business needs
Stay current with emerging technologies and participate in the definition of standards to ensure that our systems and data warehouses are efficient, resilient, and durable
Provide guidance and coaching to associate software developers
Experience using Informatica or similar products, with an understanding of heterogeneous data replication techniques