Transformer Journal Club
Abstract
Date: 2023/06/26

1. 조석우
- A bidirectional encoder representations from transformers (BERT)-based model for improving the prediction of bitter peptides from the original amino acid sequence
- Prediction of lysine crotonylation sites by a transfer learning method with pre-trained BERT models

2. 김한진
- A gap-aware transformer-encoder for sequence correction trained by an alignment-based loss
- Prediction of interactions between regulatory elements by pre-training large-scale genomic data in a multimodal and a self-supervised manner

3. 이성권
- A portmanteau of enhancer and transformer to predict gene expression and chromatin states from DNA sequences
- A Transformer for the gene expression-based classification of lung cancer subtypes that solved the complexity of high-dimensional gene expression through a multi-headed self-attention module

4. 김성민
- A high-throughput Transformer-based protein function annotator with both accuracy and generalizability
- Learning protein biological structure and function from the UniRef dataset using a pre-trained Transformer

5. 김영범
- Prediction of multiple cancer phenotypes based on somatic genomic alterations via the genomic impact transformer
- Utilizing the heterogeneous graph transformer framework to infer cell type-specific single-cell biological networks

6. 조동혁
- The first pre-trained biomedical language representation model for biomedical text mining
- A powerful alternative to mainstream medical image segmentation methods that combined transformer and U-Net

7. 이정우
- An end-to-end deep transformer-based learning model that used cancer cell transcriptome information and chemical substructures of drugs to predict drug response
- A de novo drug generation model based on Transformer architecture