other ideas
ID | Idea | Proposed by | Date |
---|---|---|---|
001 | Use the Universal Sentence Encoder + ML (see the sketch after this table). Reference: Cer et al., "Universal Sentence Encoder", arXiv:1803.11175, 2018. | Sam Tomioka | |
002 | Build a word embedding from CDISC documentation (see the sketch after this table) | Sam Tomioka | |
003 | Explore a graph-based approach, e.g. structure learning or a graph convolutional network | Sam Tomioka | |
004 | Build conventions for output | Sam Tomioka | |
005 | Use a multi-step approach rather than end-to-end: first classify the domain, then classify each variable as drop or keep, then map the kept variables to SDTM variables; this could reduce the amount of training data required (see the sketch after this table) | Sam Tomioka | |
006 | Collect metadata of the raw variables used for SDTM | Sam Tomioka | |
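
A minimal sketch of idea 001, assuming TensorFlow 2.x and `tensorflow_hub` are available: raw variable labels are embedded with the Universal Sentence Encoder and matched to SDTM variable labels by cosine similarity. The label lists below are illustrative placeholders, not project data.

```python
# Idea 001 sketch: match raw variable labels to SDTM labels via USE embeddings.
import numpy as np
import tensorflow_hub as hub

encoder = hub.load("https://tfhub.dev/google/universal-sentence-encoder/4")

raw_labels = ["Systolic blood pressure", "Date of visit", "Adverse event term"]
sdtm_labels = ["VSORRES", "SVSTDTC", "AETERM"]  # placeholder targets

raw_emb = encoder(raw_labels).numpy()
sdtm_emb = encoder(sdtm_labels).numpy()

# Cosine similarity between every raw label and every SDTM label.
raw_norm = raw_emb / np.linalg.norm(raw_emb, axis=1, keepdims=True)
sdtm_norm = sdtm_emb / np.linalg.norm(sdtm_emb, axis=1, keepdims=True)
similarity = raw_norm @ sdtm_norm.T

for label, row in zip(raw_labels, similarity):
    best = sdtm_labels[int(np.argmax(row))]
    print(f"{label!r} -> closest SDTM variable: {best}")
```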
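
A minimal sketch of idea 002, assuming gensim 4.x: train a Word2Vec embedding on tokenized CDISC documentation. The two example sentences stand in for the actual tokenized text of the CDISC Implementation Guide.

```python
# Idea 002 sketch: domain-specific word embedding trained on CDISC text.
from gensim.models import Word2Vec

cdisc_sentences = [
    ["adverse", "event", "term", "is", "mapped", "to", "aeterm"],
    ["vital", "signs", "results", "are", "stored", "in", "vsorres"],
]

model = Word2Vec(sentences=cdisc_sentences, vector_size=50, window=3,
                 min_count=1, workers=1, epochs=50)

# The resulting vectors for domain-specific terms can feed the mapping models.
print(model.wv["aeterm"][:5])
print(model.wv.most_similar("aeterm", topn=2))
```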
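
A minimal sketch of idea 005, assuming scikit-learn: three small text classifiers run in sequence (domain, drop/keep, SDTM variable) instead of one end-to-end model. The training rows are toy examples only.

```python
# Idea 005 sketch: multi-step classification instead of end-to-end mapping.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy raw-variable labels with illustrative annotations.
labels = ["systolic bp", "visit date", "ae term", "internal flag"]
domains = ["VS", "SV", "AE", "VS"]
keep = ["keep", "keep", "keep", "drop"]
variables = ["VSORRES", "SVSTDTC", "AETERM", "NONE"]

def text_clf():
    # Character n-grams cope with short, abbreviation-heavy variable labels.
    return make_pipeline(
        TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)),
        LogisticRegression(max_iter=1000),
    )

domain_clf = text_clf().fit(labels, domains)
keep_clf = text_clf().fit(labels, keep)
variable_clf = text_clf().fit(labels, variables)

def map_variable(raw_label: str) -> str:
    """Run the three steps in sequence for one raw variable label."""
    if keep_clf.predict([raw_label])[0] == "drop":
        return "dropped"
    domain = domain_clf.predict([raw_label])[0]
    variable = variable_clf.predict([raw_label])[0]
    return f"{domain}.{variable}"

print(map_variable("diastolic bp"))
```

Because each step has a smaller label space than a single end-to-end classifier, each model may need less labeled data, which is the motivation stated in the idea.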