docs/zh_CN/后端开发文档.md (+7 −0), new content:

# Backend Development Documentation

## Environment Requirements

* [Mysql](http://geek.analysys.cn/topic/124) (5.5+): required
* [JDK](https://www.oracle.com/technetwork/java/javase/downloads/index.html) (1.8+): required
* [ZooKeeper](https://mirrors.tuna.tsinghua.edu.cn/apache/zookeeper) (3.4.6+): required
* [Maven](http://maven.apache.org/download.cgi) (3.3+): required

Because the escheduler-rpc module in EasyScheduler uses gRPC, Maven is required to compile and generate the necessary classes.

## Project Build

docs/zh_CN/后端部署文档.md (+1 −1), the ZooKeeper version requirement changes from (3.4.6) to (3.4.6+):

* [Mysql](http://geek.analysys.cn/topic/124) (5.5+): required
* [JDK](https://www.oracle.com/technetwork/java/javase/downloads/index.html) (1.8+): required
* [ZooKeeper](https://www.jianshu.com/p/de90172ea680) (3.4.6+): required
* [Hadoop](https://blog.csdn.net/Evankaka/article/details/51612437) (2.6+): optional; required if you need the resource upload feature or MapReduce task submission (uploaded resource files are currently stored on HDFS)
* [Hive](https://staroon.pro/2017/12/09/HiveInstall/) (1.2.1): optional; required for Hive task submission
* Spark (1.x, 2.x): optional; required for Spark task submission
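The minimum versions listed above can be checked mechanically before installation. A minimal sketch using `sort -V` for natural version ordering (the `version_ge` helper and the sample version strings are illustrative assumptions, not part of the docs):

```shell
#!/bin/sh
# version_ge CURRENT MINIMUM -> exit 0 if CURRENT >= MINIMUM.
# Uses `sort -V` (version-number sort, GNU coreutils): the smaller
# version sorts first, so CURRENT >= MINIMUM iff MINIMUM is first.
version_ge() {
  [ "$(printf '%s\n%s\n' "$2" "$1" | sort -V | head -n1)" = "$2" ]
}

# Illustrative checks against the documented minimums:
version_ge "5.7.30" "5.5"    && echo "Mysql ok"
version_ge "1.8.0_202" "1.8" && echo "JDK ok"
version_ge "3.4.6" "3.4.6"   && echo "ZooKeeper ok"
version_ge "3.3.0" "3.3"     && echo "Maven ok"
```

In practice you would feed the helper a version string parsed from the tool itself (e.g. the output of `java -version` or `mvn --version`) rather than a hard-coded sample.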