Responsibilities:

– Build large-scale data processing systems

– Involvement in consulting projects such as technology assessment and evaluation, including delivery of discovery reports, technical solution visions, etc.

– R&D and rapid prototyping

– Involvement in long-term projects with the opportunity to learn new technologies and adopt best practices from experts

– Engagement with new clients and pre-sales support

– Contribute to SS marketing programs by publishing technical articles and blog posts in the media

 

Requirements:

– Extensive knowledge of at least one of the following programming languages: Java, Scala, Python, Ruby, JavaScript, etc.

– Data processing and computation frameworks: Kafka Streams, Storm, Spark, Flink, Beam, Akka, etc.

– Experience in stream processing, batch processing, and data integration from multiple data sources (see the sketch after this list)

– Good understanding of distributed computing, microservices and Lambda Architecture principles

– Cloud platforms: AWS, Google Cloud or Azure

– Hadoop stack: YARN, HDFS, Pig, Hive, Flume, etc.
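To give a sense of the stream processing work referenced above, here is a minimal sketch in Scala using Spark Structured Streaming. The Kafka topic name "events" and the local broker address are illustrative assumptions, not project specifics.

import org.apache.spark.sql.SparkSession

object StreamingWordCount {
  def main(args: Array[String]): Unit = {
    // Local session for illustration; a production job would run on YARN, Kubernetes, or a cloud platform
    val spark = SparkSession.builder()
      .appName("StreamingWordCount")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    // Read a stream from a hypothetical Kafka topic "events" (requires the spark-sql-kafka connector)
    val lines = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "localhost:9092")
      .option("subscribe", "events")
      .load()
      .selectExpr("CAST(value AS STRING) AS line")
      .as[String]

    // Split each record into words and maintain a running count per word
    val counts = lines
      .flatMap(_.split("\\s+"))
      .groupBy("value")
      .count()

    // Emit the full running aggregate to the console after every micro-batch
    counts.writeStream
      .outputMode("complete")
      .format("console")
      .start()
      .awaitTermination()
  }
}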
