Professional skills
• Familiar with Scala, with two years of working experience in it;
• 3+ years of experience with the Linux operating system and shell commands;
• 1+ year of experience with the Spark distributed computing platform, including Spark SQL and Spark Streaming development;
• 1+ year of experience in Akka framework development.
Responsibilities:
• Develop, deploy, and test our Hadoop stack (Spark, HDFS, HBase, YARN);
• Develop APIs based on the Akka framework;
• Clean data and process structured data (see the sketch after this list);
• Write functional requirements for the system;
• Cooperate with outsourcing companies to implement the system.
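The data-cleaning responsibility above would typically be handled with Spark SQL in Scala. Below is a minimal sketch of such a job, assuming a hypothetical CSV input on HDFS and made-up column names (user_id, event_time); it only illustrates the kind of work involved, not the company's actual pipeline.

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object CleanStructuredData {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("clean-structured-data")
      .getOrCreate()

    // Hypothetical raw input: CSV records with a header row on HDFS
    val raw = spark.read
      .option("header", "true")
      .csv("hdfs:///data/raw/events.csv")

    // Basic cleaning: drop rows missing key fields, trim string columns,
    // and cast the timestamp column to a proper type
    val cleaned = raw
      .na.drop(Seq("user_id", "event_time"))
      .withColumn("user_id", trim(col("user_id")))
      .withColumn("event_time", to_timestamp(col("event_time")))

    // Persist the structured result as Parquet for downstream Spark SQL queries
    cleaned.write.mode("overwrite").parquet("hdfs:///data/clean/events")

    spark.stop()
  }
}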
Soft requirements
• Experience with natural language processing, machine learning, or deep learning is preferred;
• Upper-intermediate/advanced English level (we don't expect you to know Chinese, but after working with us you will be able to speak a little of it).
What's in it for you?
• Competitive remuneration packages
• Five-day work week (5/2), 9:00-18:00; paid travel is offered.
• Office in the financial center of Shanghai, right by the Central Avenue metro station.

Company site: https://www.buyint.com/
LinkedIn: https://www.linkedin.com/groups/10347149?lipi=urn:li:..
E-mail: xiaolujuan@mergeralliance.org
2017.09.15