...knowledge).
~ Knowledge of extract, transform, load (ETL) tools.
~ Knowledge in big data tools such as MongoDB, Kafka, and Hadoop.
~ Experience in automation and scripting.
~ Precision and excellent attention to detail.
~ Analytical and problem-solving...
...Development leveraging Python, Java, Scala or similar, and Data Warehousing.
~ Strong experience using big data technologies (PostgreSQL, Hadoop, Hive, HBase, Spark, etc.).
~ Experience working with data streaming technologies (Kafka, Spark Streaming, etc.).
~ Knowledge and...
...includes: Apache Mesos, Kubernetes, HAProxy, Nginx, Consul, Percona XtraDB Cluster, RabbitMQ, Apache Kafka, ELK, Prometheus/Grafana, Hadoop/HBase, ProxySQL and others.
As a Senior DevOps Engineer, you’ll work closely with developers to transform and migrate to our next...
...problem-solving skills.
~ Excellent communication skills, both verbal and written.
~ Experience with big data technologies such as Hadoop, Spark, or Hive is a plus.
~ Experience in the financial industry is a plus.
Benefits
Attractive remuneration package...
...Expertise and hands-on experience using and managing databases, including knowledge of advanced databases and processing frameworks such as MapReduce, Hadoop, Spark, Flume, Hive, Impala, Spark SQL, BigQuery, etc.
Advanced working knowledge of Linux, Web 2.0 development platforms, solutions...
...Scientist, with a strong portfolio of successful projects.
Experience with database querying and analysis tools such as Apache Hadoop, SQL, and SAS.
Proficient in programming languages such as Python and R.
Excellent communication skills to convey complex...
...version control systems, bug trackers.
Willingness to adopt new technologies, testing methodologies and frameworks.
Experience in Apache Hadoop, Spark, Pandas, Tableau, or Power BI would be a plus.
Good command of English.
*Please send in your resume in English.
WHO WE ARE:...
...processing
You deal with large structured and unstructured datasets (1M rows, 1TB of various document types) → Buzzwords: Spark, MapReduce, Hadoop, Distributed Computing, Parquet (Data format)
You test and evaluate the data setup
You support the Machine Learning...
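For candidates unfamiliar with the buzzwords listed above, the MapReduce model that Hadoop and Spark scale out can be illustrated in a few lines of plain Python (a toy sketch only; real frameworks distribute these phases across a cluster, and all names here are illustrative):

```python
from collections import defaultdict
from itertools import chain

# Toy illustration of the MapReduce pattern: map -> shuffle -> reduce.

def map_phase(doc):
    # Emit a (word, 1) pair for each word in a document.
    return [(word, 1) for word in doc.split()]

def shuffle_phase(pairs):
    # Group values by key, as the framework does between map and reduce.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Sum the counts emitted for each word.
    return {key: sum(values) for key, values in groups.items()}

docs = ["spark hadoop spark", "hadoop parquet"]
pairs = chain.from_iterable(map_phase(d) for d in docs)
counts = reduce_phase(shuffle_phase(pairs))
# counts == {"spark": 2, "hadoop": 2, "parquet": 1}
```

Spark, Hadoop, and similar tools apply this same pattern to datasets far too large for one machine, which is what the role above calls "Distributed Computing".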