
Followmwdoit Forum


After taking the big data training, what jobs should I look for?

Administrator | Posted on 2016-5-1 21:20:26
After today's session, a student asked the question above. First, let's review what we have covered so far (a small code sketch follows the list below to make it concrete).

- Hadoop and Hortonworks (HDFS, MapReduce, YARN, Hive, Pig, Sqoop, etc.)

- BI (ETL, Data Mining, Tableau, PowerPivot/View/Map, etc.)
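To make the Hadoop/MapReduce item above concrete, here is a minimal sketch of the kind of job covered in class: the classic word count, written against the org.apache.hadoop.mapreduce API. The class names and input/output paths are illustrative only, not taken from the course materials.

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

// Minimal word-count job: the "hello world" of MapReduce.
public class WordCount {

    // Mapper: emit (word, 1) for every token in each input line.
    public static class TokenMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            StringTokenizer tokens = new StringTokenizer(value.toString());
            while (tokens.hasMoreTokens()) {
                word.set(tokens.nextToken());
                context.write(word, ONE);
            }
        }
    }

    // Reducer: sum the counts emitted for each word.
    public static class SumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable v : values) {
                sum += v.get();
            }
            context.write(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "word count");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenMapper.class);
        job.setCombinerClass(SumReducer.class);
        job.setReducerClass(SumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));   // e.g. an HDFS input directory
        FileOutputFormat.setOutputPath(job, new Path(args[1])); // output directory must not exist yet
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}

Hive, Pig, and Sqoop sit on top of the same ideas; once you can read a job like this, the rest of the ecosystem is much easier to pick up.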

Before we even get to Spark, what jobs are already out there waiting for you? Try a search on Indeed; "Hadoop Developer" or "Big Data Developer" would likely be a very good match.


Here is a good example: Senior Hadoop Developer from Adecco
Roevin IT/Adecco's Client: A top-notch multi-national corporation servicing the Banking Industry (they are hiring for a bank, probably one of the Big Five)
Job title: Senior Hadoop Developer
Current Openings: 4-5
Job location: Toronto, Ontario M5V 3K7/ M3C 0C1
Job type: Full-time, Permanent


What are the requirements?


Job Responsibilities for the Sr. Hadoop Developer:
- Design and Build distributed, scalable, and reliable data pipelines that ingest and process data at scale and in real-time.
- Collaborate with other teams to design and develop data tools that support both operations like data quality and product use cases.
- Source huge volumes of data from diverse data platforms into the Hadoop platform.
- Perform offline analysis of large data sets using components from the Hadoop ecosystem.
- Evaluate big data technologies and prototype solutions to improve our data processing architecture.
- Knowledge of Banking domain is an added advantage

Desired Candidate Profile for the Sr. Hadoop Developer:
- 7+ years of hands-on programming experience with 3+ years in Hadoop platform (very few candidates will actually have that many years, so this is mostly there to scare people off)
- Experience developing Hadoop based platforms for building Data Lakes and implementing inline or offline Data quality capabilities
- Knowledge of various components of Hadoop ecosystem and experience in applying them to practical problems
- Proficiency with Java and one of the scripting languages like Python / Scala etc.
- Flair for data, schema, data model, how to bring efficiency in big data related life cycle
- Experience building ETL frameworks in Hadoop using Pig / Hive / MapReduce / DataTorrent
- Experience in creating custom UDFs and custom input/output formats / serdes
- Ability to acquire, compute, store and provision various types of datasets in Hadoop platform
- Understanding of various Visualization platforms (Tableau, Qlikview, others)
- Experience in data warehousing, ETL tools, MPP database systems
- Strong object-oriented design and analysis skills
- Excellent technical and organizational skills
- Excellent written and verbal communication skills

Top skill sets / technologies desired in the Sr. Hadoop Developer:
- Java / Python / Scala
- Unix / ETL / Data Warehouse / SQL knowledge
- Sqoop/Flume/Kafka/Pig/Hive/(Talend or Pentaho or Informatica or similar ETL) / HBase / NoSQL / MapReduce/Spark
- Experience on DataTorrent is an advantage.
- Data Integration/Data Management/Data Visualization experience


This is a senior-level position, but it gives you a good sense of the main technical requirements for this kind of role.
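Since the "custom UDFs" requirement above shows up in many of these postings, here is a minimal sketch of a Hive UDF in Java. The class, function, table, and column names (MaskTailUDF, mask_tail, transactions, account_no) are made up for illustration; it assumes the hive-exec dependency is on the classpath and uses the simple (older) UDF API rather than GenericUDF.

import org.apache.hadoop.hive.ql.exec.Description;
import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

// A toy Hive UDF that masks all but the last four characters of a string,
// e.g. for hiding account numbers in query output.
@Description(name = "mask_tail",
             value = "_FUNC_(str) - masks all but the last 4 characters of str")
public class MaskTailUDF extends UDF {

    public Text evaluate(Text input) {
        if (input == null) {
            return null; // Hive passes NULLs through; keep that behaviour
        }
        String s = input.toString();
        if (s.length() <= 4) {
            return new Text(s);
        }
        StringBuilder masked = new StringBuilder();
        for (int i = 0; i < s.length() - 4; i++) {
            masked.append('*');
        }
        masked.append(s.substring(s.length() - 4));
        return new Text(masked.toString());
    }
}

// Registration in the Hive CLI / beeline (jar path and table are hypothetical):
//   ADD JAR /path/to/mask-udf.jar;
//   CREATE TEMPORARY FUNCTION mask_tail AS 'MaskTailUDF';
//   SELECT mask_tail(account_no) FROM transactions LIMIT 10;

Masking account numbers is exactly the kind of small data-quality and privacy task a banking client would ask for, which is why it makes a reasonable practice exercise before interviewing for a role like this.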


Administrator | Posted on 2016-5-2 21:38:55
A great template for anyone applying for a senior big data position!