BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Date iCal//NONSGML kigkonsult.se iCalcreator 2.20.4//
METHOD:PUBLISH
BEGIN:VTIMEZONE
TZID:America/New_York
BEGIN:STANDARD
DTSTART:20191103T020000
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
TZNAME:EST
END:STANDARD
BEGIN:DAYLIGHT
DTSTART:20190310T020000
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
TZNAME:EDT
END:DAYLIGHT
END:VTIMEZONE
BEGIN:VEVENT
UID:calendar.361751.field_event_date.0@www.wright.edu
DTSTAMP:20260220T001025Z
CREATED:20190724T134540Z
DESCRIPTION:Committee: Drs. Pascal Hitzler (Advisor)\, Mateen Rizki\, and Yong Pei\n\nABSTRACT: In recent years\, research in deep learning and knowledge engineering has had a broad impact on data and knowledge representation. Research in knowledge engineering has frequently focused on modeling high-level human cognitive abilities such as reasoning\, making inferences\, and validation. Semantic Web technologies and deep learning share an interest in creating intelligent artifacts. Deep learning is a set of machine learning algorithms that attempt to model data representations through many layers of non-linear transformations. It is increasingly employed to analyze the knowledge representations used in the Semantic Web and yields better results for Semantic Web reasoning and querying.\n\nResearchers at the Data Semantic Laboratory (DaSe Lab) have developed a method for training a deep learning model\, based on an End-to-End memory network\, over RDF knowledge graphs. With the help of triple normalization\, the model can reason over new RDF graphs with high precision and recall compared to traditional deductive algorithms. On the dataset used in their experiments\, the normalized model is also 40 times faster to train than the non-normalized model. By applying normalization\, the model can transfer its reasoning ability from one domain to another without any re-training\, pre-training\, or fine-tuning on the new domain\, which constitutes transfer learning.\n\nIn this thesis\, we test this normalized embedding approach against the work of Bassem Makni and James Hendler\, “Deep Learning for Noise-tolerant RDFS Reasoning” [a]. The main limitation of their approach is that training is done on a dataset that uses only one ontology for inference. To overcome this limitation\, we propose a transfer learning process that adds the normalization approach created by the DaSe Lab researchers to enable reasoning over different ontologies/domains.
DTSTART;TZID=America/New_York:20190729T100000
DTEND;TZID=America/New_York:20190729T120000
LAST-MODIFIED:20190724T141736Z
LOCATION:304 Russ Engineering
SUMMARY:Masters Thesis Defense “Transfer Learning for Knowledge Graph Reasoning using Normalization” By BhargavaCharan Reddy Kaithi
URL;TYPE=URI:/events/masters-thesis-defense-%E2%80%9Ctransfer-learning-knowledge-graph-reasoning-using-normalization%E2%80%9D
END:VEVENT
END:VCALENDAR