This paper proposes an information-theory-based method for analyzing feature types in probabilistic evaluation models for statistical parsing. The basic idea is to use entropy and conditional entropy to measure how much of the information needed for syntactic structure prediction a feature type captures. Our experiments quantitatively analyze the predictive power of several feature types for syntactic structure and draw a series of conclusions.
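For concreteness, the following is a minimal sketch of the standard information-theoretic quantities presumably underlying the analysis, where $Y$ denotes the syntactic decision to be predicted and $X$ a feature type; the notation here is illustrative and not taken from the paper itself.

% Uncertainty of the syntactic decision Y before observing any feature:
\[ H(Y) = -\sum_{y} P(y)\,\log_2 P(y) \]

% Remaining uncertainty about Y after observing feature type X:
\[ H(Y \mid X) = -\sum_{x} P(x) \sum_{y} P(y \mid x)\,\log_2 P(y \mid x) \]

% The reduction in uncertainty, i.e. the mutual information, quantifies how much
% information the feature type carries about the syntactic structure:
\[ I(X;Y) = H(Y) - H(Y \mid X) \]

Under this reading, a feature type with a larger value of $I(X;Y)$ (equivalently, a lower $H(Y \mid X)$) would be judged more informative for syntactic structure prediction.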