
VIETNAM NATIONAL UNIVERSITY, HANOI
UNIVERSITY OF ENGINEERING AND TECHNOLOGY

Can Duy Cat

ADVANCED DEEP LEARNING MODELS AND APPLICATIONS IN SEMANTIC RELATION EXTRACTION

MASTER THESIS
Major: Computer Science
Supervisors: Assoc. Prof. Ha Quang Thuy, Assoc. Prof. Chng Eng Siong

HA NOI - 2019

Abstract

Relation Extraction (RE) is one of the most fundamental tasks of Natural Language Processing (NLP) and Information Extraction (IE). To extract the relationship between two entities in a sentence, two common approaches are (1) using their shortest dependency path (SDP) and (2) using an attention model to capture a context-based representation of the sentence. Each approach suffers from its own disadvantage of either missing or redundant information. In this work, we propose a novel model that combines the advantages of these two approaches. It is based on the core information in the SDP, enhanced with information selected by several attention mechanisms with kernel filters, namely RbSP (Richer-but-Smarter SDP). To exploit the representation behind the RbSP structure effectively, we develop a combined Deep Neural Network (DNN) with a Long Short-Term Memory (LSTM) network on word sequences and a Convolutional Neural Network (CNN) on the RbSP.
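The SDP mentioned above is the path between two entities through their lowest common ancestor in the sentence's dependency parse. The following minimal sketch illustrates the idea on a toy parse; the sentence, head map, and function names are illustrative assumptions, not the thesis implementation.

```python
# Sketch: extracting the shortest dependency path (SDP) between two
# entities in a dependency tree represented as a token -> head mapping.

def path_to_root(token, heads):
    """Return the list of tokens from `token` up to the root."""
    path = [token]
    while heads[token] is not None:
        token = heads[token]
        path.append(token)
    return path

def shortest_dependency_path(e1, e2, heads):
    """SDP = the two entities' paths joined at their lowest common ancestor."""
    p1, p2 = path_to_root(e1, heads), path_to_root(e2, heads)
    ancestors = set(p1)
    # walk up from e2 until we hit a token that also lies on e1's path (the LCA)
    lca_idx = next(i for i, t in enumerate(p2) if t in ancestors)
    lca = p2[lca_idx]
    left = p1[:p1.index(lca) + 1]          # e1 ... lca
    right = list(reversed(p2[:lca_idx]))   # ... e2 (lca excluded to avoid duplication)
    return left + right

# Toy parse of "The burst has been caused by pressure":
# each token maps to its syntactic head (None marks the root).
heads = {
    "The": "burst", "burst": "caused", "has": "caused", "been": "caused",
    "caused": None, "by": "pressure", "pressure": "caused",
}
print(shortest_dependency_path("burst", "pressure", heads))
# → ['burst', 'caused', 'pressure']
```

The SDP keeps only the tokens syntactically linking the two entities, which is why it can miss relevant context; RbSP's attention mechanisms are meant to recover that lost information.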
Furthermore, experiments on the RE task showed that data representation is one of the most influential factors in a model's performance, yet it still has many limitations. We propose (i) a compositional embedding that combines several dominant linguistic as well as architectural features and (ii) dependency tree normalization techniques for generating rich representations for both words and dependency relations in the SDP. Experimental results
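A compositional embedding of this kind is typically built by concatenating a word vector with embeddings of auxiliary linguistic features such as the POS tag and dependency relation. The sketch below shows the concatenation step only; the vocabularies, dimensions, and random vectors are illustrative assumptions, not the thesis configuration.

```python
# Sketch: composing one token's rich representation from a word embedding
# plus embeddings of linguistic features (POS tag, dependency relation).
import numpy as np

rng = np.random.default_rng(0)
word_emb = {"caused": rng.normal(size=8)}      # 8-dim word vector (toy)
pos_emb  = {"VBN": rng.normal(size=3)}         # 3-dim POS embedding (toy)
dep_emb  = {"nsubjpass": rng.normal(size=3)}   # 3-dim dependency-relation embedding (toy)

def compose(word, pos, dep):
    """Concatenate the per-feature vectors into one token representation."""
    return np.concatenate([word_emb[word], pos_emb[pos], dep_emb[dep]])

vec = compose("caused", "VBN", "nsubjpass")
print(vec.shape)  # → (14,)
```

In a trained model each lookup table would be a learned embedding matrix rather than fixed random vectors, but the resulting representation is the same concatenation of feature vectors.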
