Deep Forest: Towards an Alternative to Deep Neural Networks∗
Zhi-Hua Zhou and Ji Feng
National Key Lab for Novel Software Technology, Nanjing University, Nanjing 210023, China
{zhouzh, fengj}@lamda.nju.edu.cn
Abstract
In this paper, we propose gcForest, a decision tree ensemble approach with performance highly competitive to deep neural networks in a broad range of tasks. In contrast to deep neural networks, which require great effort in hyper-parameter tuning, gcForest is much easier to train; even when it is applied to different data across different domains in our experiments, excellent performance can be achieved with almost the same hyper-parameter settings. The training process of gcForest is efficient, and users can control the training cost according to the computational resources available. The efficiency may be further enhanced because gcForest is naturally apt to parallel implementation. Furthermore, in contrast to deep neural networks which require large-
