Real-Time Back-End Software Engineer
We see a Real-Time Back-End Software Engineer as a skilled software engineer whose primary focus is large-scale real-time data processing. The role often requires experience with special programming techniques (lock-free, cache-friendly, allocation-free, thread-safe collections, etc.), OS and hardware knowledge, and experience with algorithms, all of which are largely language-agnostic. Two real-time programmers using different languages (Java and C++, for instance) will have more in common than a regular Java developer and a real-time one. The role often involves specialized classes of cloud services or open-source frameworks such as message brokers, time-series DBMSs, and stream-processing frameworks. An understanding of the principles of building throughput and latency tests is also expected.
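As a small illustration of the lock-free techniques mentioned above, here is a minimal sketch in Java of a counter incremented concurrently through a compare-and-set (CAS) retry loop instead of a lock (the class and method names are illustrative, not from any specific codebase):

```java
import java.util.concurrent.atomic.AtomicLong;

// Minimal sketch of a lock-free counter: threads never block on a mutex,
// they retry a CAS until their update lands.
public class LockFreeCounter {
    private final AtomicLong value = new AtomicLong();

    // Increment without taking a lock: read the current value and retry
    // compareAndSet until no other thread has raced us.
    public long increment() {
        long current;
        do {
            current = value.get();
        } while (!value.compareAndSet(current, current + 1));
        return current + 1;
    }

    public long get() {
        return value.get();
    }

    public static void main(String[] args) throws InterruptedException {
        LockFreeCounter counter = new LockFreeCounter();
        Thread[] threads = new Thread[4];
        for (int i = 0; i < threads.length; i++) {
            threads[i] = new Thread(() -> {
                for (int j = 0; j < 100_000; j++) counter.increment();
            });
            threads[i].start();
        }
        for (Thread t : threads) t.join();
        System.out.println(counter.get()); // prints 400000: no increments lost
    }
}
```

In production code `AtomicLong.incrementAndGet()` does the same job in one call; the explicit CAS loop is spelled out here only to show the retry pattern that underlies most lock-free structures.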
Below is a list of skills and technologies we are interested in. See for yourself whether any of them fall within your scope of expertise or interest. You do not have to master all of them: tell us what your strong side is, and we will find the right task for you. We value the drive for knowledge and self-development within our team.
- Strong knowledge of at least one of Java/C++/C#/Go.
- Understanding of principles of REST API, server/client communication (WebSocket, SSE).
- Expertise in multi-threaded development and synchronization patterns.
- Understanding of the basic principles of developing distributed, performance-oriented systems.
- Fluent with git, build/CI systems, and package management on a platform of your choice.
- Experience with time-series databases: InfluxDB, Prometheus, TimescaleDB, or similar.
- Experience with message brokers such as Kafka, Pulsar, GCP Pub/Sub, or similar.
- Experience with stream-processing systems such as Kafka Streams, Flink, or similar.
- Experience with real-time data cloud services on a platform of your choice:
  - AWS IoT Core, GCP IoT Core, Azure IoT Hub
  - AWS Kinesis, GCP Pub/Sub, Azure Service Bus
  - AWS Lambda, GCP Cloud Functions, GCP Dataflow, Azure Functions, Azure Stream Analytics
  - AWS Athena, GCP Bigtable, GCP BigQuery, Azure Data Lake Analytics
- Experience with application "dockerization", Kubernetes, and Helm.
- Experience with OpenID/OAuth integration.
- Experience with making architecture decisions.
- Experience with high-performance DBMSs like PostgreSQL, ClickHouse, Aerospike, or RocksDB. Understanding of the principles of selecting the best DB engine for a task.
- Experience in high-level (non-low-level) code optimization.
- Expertise in algorithms.
- Experience with low-level networking (IP, TCP, UDP multicast, InfiniBand, RDMA, Solarflare).