You will get a chance to work on the latest big data stack: NiFi, Spark, Flink, etc.
Not sure about the outlook.
The written coding interview was followed by a use-case design discussion on change data capture approaches. We discussed HBase, Hive/Impala, and Java programming concepts, and also covered ETL concepts in both traditional and Hadoop contexts.
Quite straightforward. Two senior engineers on the call asked a lot of detailed questions about the architecture and the underlying framework. There was also a simulation of an ad hoc remote problem-solving session.
The process involves a few steps:
* Conversations with HR.
* Conversations with the technical area.
* Conversations with your potential supervisors.
It requires communication skills, proficiency in English, and technical expertise in the area.