Research Activities (August 22, 2024 – September 15, 2024)
Weekly log
Week 1: Foundational Research (August 22–27)
* Reviewed seminal papers on attention mechanisms (core formula recapped after this list), including:
- Attention Is All You Need
- Masked-attention Mask Transformer for Universal Image Segmentation
* Developed proficiency in Anaconda and Python programming for AI applications.
* Joined the laboratory's research group under the mentorship of senior researchers.
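For reference, the scaled dot-product attention at the heart of both papers is (notation as in Attention Is All You Need):

  \mathrm{Attention}(Q, K, V) = \operatorname{softmax}\!\left( \frac{Q K^{\top}}{\sqrt{d_k}} \right) V

where Q, K, and V are the query, key, and value matrices and d_k is the key dimension.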
Week 2: Collaborative Paper Development (August 28 – September 3)
* Supported experiments for the paper TGMformer: Transferability Guided Mask Transformer for Segmentation Domain Adaptation (led by senior researcher Zhang Enming):
- Implemented baseline models comparing fine-tuning strategies (last layer vs. full network) for transfer learning; see the sketch after this list.
- Formalized mathematical definitions for transfer learning, clustering algorithms, and attention masking for the Methods section; the masked-attention form is recapped below.
- Assisted with LaTeX typesetting and document formatting.
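A minimal sketch of the two baseline fine-tuning strategies, assuming a torchvision ResNet-50 backbone and plain SGD; the backbone, hyperparameters, class count, and helper name build_finetune_model are illustrative, not the paper's actual configuration.

  import torch
  import torch.nn as nn
  from torchvision import models

  def build_finetune_model(num_classes: int, full_network: bool) -> nn.Module:
      """Return a pretrained backbone set up for last-layer or full fine-tuning."""
      model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2)
      model.fc = nn.Linear(model.fc.in_features, num_classes)  # new task head

      if not full_network:
          # Last-layer strategy: freeze every parameter except the new head.
          for name, param in model.named_parameters():
              param.requires_grad = name.startswith("fc.")
      return model

  # Only parameters with requires_grad=True are passed to the optimizer, so the
  # frozen backbone stays fixed under the last-layer strategy.
  model = build_finetune_model(num_classes=200, full_network=False)
  trainable = [p for p in model.parameters() if p.requires_grad]
  optimizer = torch.optim.SGD(trainable, lr=1e-3, momentum=0.9)
  criterion = nn.CrossEntropyLoss()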
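For the attention-masking part, the masked cross-attention from Mask2Former (reviewed in Week 1) has the form below; whether TGMformer uses exactly this form is an assumption here, and the notation in the paper's Methods section may differ.

  \mathbf{X}_l = \operatorname{softmax}\!\left( \mathcal{M}_{l-1} + \mathbf{Q}_l \mathbf{K}_l^{\top} \right) \mathbf{V}_l + \mathbf{X}_{l-1},
  \qquad
  \mathcal{M}_{l-1}(x, y) =
  \begin{cases}
    0 & \text{if the layer-}(l-1)\text{ mask prediction is foreground at } (x, y) \\
    -\infty & \text{otherwise}
  \end{cases}

Here Q_l, K_l, and V_l are the layer-l query, key, and value features, and X_l is the updated query embedding.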
Week 3: Data Pipeline Implementation (September 4–10)
* Continued work on the paper TGMformer: Transferability Guided Mask Transformer for Segmentation Domain Adaptation:
- Engineered a data loader and task-splitting module for the ImageNet-R dataset to support experiments with the Hypernet framework; see the sketch after this list.
- Debugged and optimized the pipeline code to improve its reliability.
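A minimal sketch of the data-loading and task-splitting idea, assuming ImageNet-R is stored in torchvision ImageFolder layout and that tasks are class-disjoint, equal-sized splits of its 200 classes; the splitting rule, the helper name build_task_loaders, and the path below are illustrative and may not match the Hypernet project's actual interface.

  import torch
  from torch.utils.data import DataLoader, Subset
  from torchvision import datasets, transforms

  def build_task_loaders(root: str, num_tasks: int = 10, batch_size: int = 64):
      """Split an ImageFolder-style ImageNet-R copy into class-disjoint task loaders."""
      transform = transforms.Compose([
          transforms.Resize(256),
          transforms.CenterCrop(224),
          transforms.ToTensor(),
      ])
      dataset = datasets.ImageFolder(root, transform=transform)

      num_classes = len(dataset.classes)            # 200 for ImageNet-R
      classes_per_task = num_classes // num_tasks   # e.g. 20 classes per task
      targets = torch.tensor(dataset.targets)

      loaders = []
      for t in range(num_tasks):
          task_classes = torch.arange(t * classes_per_task, (t + 1) * classes_per_task)
          # Indices of all samples whose label falls inside this task's classes.
          idx = torch.isin(targets, task_classes).nonzero().flatten().tolist()
          loaders.append(DataLoader(Subset(dataset, idx),
                                    batch_size=batch_size, shuffle=True))
      return loaders

  # Hypothetical usage: one loader per task, consumed sequentially by the
  # Hypernet training loop (the path is illustrative).
  task_loaders = build_task_loaders("/data/imagenet-r", num_tasks=10)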
Week 4: Analysis and Synthesis (September 11–15)
* Evaluated experimental results from the Hypernet project and documented insights.
* Authored a reflective summary highlighting growth in research skills, data processing, and software testing.