There are several algorithms for the induction of fuzzy decision trees, most of which extend existing decision-tree methods. The UR-ID3 algorithm (Maher and Clair, 1993) starts by building a strict decision tree and subsequently fuzzifies the conditions of the tree. Tani and Sakoda (1992) use the ID3 algorithm to select effective numerical attributes; the splitting intervals obtained are used as fuzzy boundaries, and regression is then applied in each subspace to form fuzzy rules. Cios and Sztandera (1992) use the ID3 algorithm to convert a decision tree into a layer of a feedforward neural network, where each neuron is represented as a hyperplane with a fuzzy boundary. Nodes are added to the hidden layer until a fuzzy entropy measure is reduced to zero, and new hidden layers are generated until there is only one node at the output layer. Fuzzy-CART (Jang, 1994) uses the CART algorithm to build a tree; however, the tree built in this first step is only used to propose fuzzy sets over the continuous domains, based on the generated thresholds. A layered network algorithm is then employed to learn the fuzzy rules. This produces more comprehensible fuzzy rules and improves on CART's initial results.

Another complete framework for building a fuzzy tree, including several inference procedures based on conflict resolution in rule-based systems and efficient approximate reasoning methods, was presented by Janikow (1998). Olaru and Wehenkel (2003) presented a new type of fuzzy decision tree called soft decision trees (SDT). This approach combines tree growing and pruning to determine the structure of the soft decision tree, and uses refitting and backfitting to improve its generalization capabilities. The authors showed empirically that soft decision trees are significantly more accurate than standard decision trees. Moreover, a global study of model variance showed a much lower variance for soft decision trees than for standard trees, which is a direct cause of the improved accuracy.
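To make the recurring idea concrete, the sketch below illustrates the common thread of these methods: a crisp split threshold is replaced by a fuzzy boundary, and inference propagates an example down both branches, weighting each branch by its membership degree. This is only a minimal illustration of the general technique, not the implementation of UR-ID3, Fuzzy-CART, or the SDT algorithm; the linear (trapezoidal) membership function, the `width` parameter, and the toy tree are illustrative assumptions.

```python
import numpy as np

def fuzzify_threshold(x, threshold, width):
    """Turn a crisp split (x < threshold) into a pair of fuzzy memberships.

    Inside the transition band [threshold - width, threshold + width] the
    example belongs partially to both branches; outside it the split is crisp.
    Returns (mu_left, mu_right) with mu_left + mu_right == 1.
    """
    lo, hi = threshold - width, threshold + width
    mu_left = float(np.clip((hi - x) / (hi - lo), 0.0, 1.0))
    return mu_left, 1.0 - mu_left

def soft_tree_predict(node, example):
    """Soft inference: descend *both* branches and combine their predictions,
    weighted by the membership degrees of the fuzzified split."""
    if "label" in node:                        # leaf: class-probability vector
        return np.asarray(node["label"], dtype=float)
    x = example[node["attribute"]]
    mu_left, mu_right = fuzzify_threshold(x, node["threshold"], node["width"])
    return (mu_left * soft_tree_predict(node["left"], example)
            + mu_right * soft_tree_predict(node["right"], example))

# A tiny hand-built soft tree over a single hypothetical attribute.
tree = {
    "attribute": "petal_length", "threshold": 2.5, "width": 0.5,
    "left":  {"label": [0.9, 0.1]},
    "right": {"attribute": "petal_length", "threshold": 4.8, "width": 0.7,
              "left":  {"label": [0.2, 0.8]},
              "right": {"label": [0.05, 0.95]}},
}

print(soft_tree_predict(tree, {"petal_length": 2.3}))  # near the boundary: mixes both branches
print(soft_tree_predict(tree, {"petal_length": 6.0}))  # far from boundaries: effectively a crisp path
```

Near a threshold the prediction blends the two subtrees rather than jumping discontinuously between them, which is one intuition for why soft splits can reduce the model variance that plagues standard crisp trees around their split points.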