Zhangyang (Atlas) Wang

Assistant Professor [CV] [Google Scholar] [GitHub]

  • Ph.D., ECE@UIUC, 2016; B.S., EEIS@USTC, 2012
  • Recent Honors: ARO Young Investigator Award (2020), AWS Machine Learning Research Award (2020), IBM Faculty Research Award (2020), TEES Young Faculty Fellow (2020), TAMU Engineering Genesis Award (2019)...

Department of Computer Science and Engineering

(Adjunct) Department of Electrical and Computer Engineering

Texas A&M University, College Station, TX

Address: 328C H.R. Bright Building, College Station, TX, 77843-3112

Email: atlaswang@tamu.edu

Most Recent Updates


  • 2020 (so far): Our group has published 3 ECCV, 6 ICML, 3 CVPR, 1 AISTATS, 1 ISCA, 1 IEEE TPAMI, 1 IEEE TIP, 1 IEEE TMC, and more. I also received an ARO Young Investigator Award (YIP), an IBM Faculty Research Award, and an AWS ML Research Award, and was named a TEES Young Faculty Fellow.
  • 2019: Our group published 4 ICLR, 3 NeurIPS, 5 ICCV, 1 ICML, 2 CVPR, 1 MICCAI, 2 AAAI, 1 Bioinformatics, 1 IEEE TIP, 1 IEEE TMI, and more. We won the 3rd prize in the ICCV'19 WIDER Challenge (Track 4), and I also received a TAMU Engineering Genesis Award.
  • 2018: Our group published 1 ICLR, 2 NeurIPS, 1 ICML, 1 AISTATS, 1 ECCV, 1 KDD, 1 IJCAI, 1 AAAI, 3 IEEE TIP, and more. We also won two challenge prizes from CVPR'18 UG2 and ECCV'18 ChaLearn (Track 3).

[See more in News]

Open Calls


  • Multiple openings for Ph.D. and visiting students (scroll down to the bottom of this page).
  • Call for Participation & Papers: ECCV 2020 Workshop on Real-world Recognition from Low-quality Inputs and 1st Tiny Object Detection Challenge (RLQ-TOD) [Website]
  • Call for Papers: IJCAI 2020 International Workshop on Biomedical infOrmatics with Optimization and Machine learning (BOOM) [Website]


Research Narrative

I live in the blessed world of machine learning and computer vision. My research interests constantly evolve: below are just some of the recent topics I work on. I always stay open to being intellectually excited and inspired by new things.


[A] As Goals -- Enhancing Deep Learning Robustness, Efficiency, and Privacy

I seek to build deep learning solutions that go well beyond accurate data-driven predictors. In my view, an ideal model should at least: (1) be robust to perturbations and attacks (and therefore trustworthy); (2) be efficient and hardware-friendly (for deployment on practical platforms); and (3) be designed to respect individual privacy and fairness.

    • Robustness: We are keen on improving deep model robustness against perturbations and attacks, toward trustworthy models (see the adversarial-training sketch after this list).
    • Efficiency: We have conducted pioneering work on energy-efficient training of deep models [ICLR'20, NeurIPS'19, ECCV'20]. For inference, we study reducing model size and latency [ICML'18, IEEE TMC'20, ECCV'20], energy cost [IEEE JSTSP'19, AAAI'20], and memory footprint [CVPR'19]. We also actively collaborate with hardware experts to pursue efficient algorithm-hardware co-design [ISCA'20].
    • Privacy: We address privacy leakage risks that arise when sharing training data [arXiv'19, ECCV'18] and trained models, using techniques from adversarial learning and differential privacy. We released the first privacy-preserving video recognition dataset, PA-HMDB51.
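
To give a concrete flavor of the robustness theme, below is a minimal sketch of PGD-based adversarial training in PyTorch. This is the standard textbook recipe rather than any specific method from our papers; the model, data batch, and hyperparameters are illustrative placeholders only.

    import torch
    import torch.nn.functional as F

    def pgd_attack(model, x, y, eps=8/255, alpha=2/255, steps=7):
        """Projected gradient descent: search for a worst-case perturbation in an L-inf ball."""
        x_adv = x + torch.empty_like(x).uniform_(-eps, eps)   # random start inside the ball
        for _ in range(steps):
            x_adv = x_adv.detach().requires_grad_(True)
            loss = F.cross_entropy(model(x_adv), y)
            grad = torch.autograd.grad(loss, x_adv)[0]
            x_adv = x_adv + alpha * grad.sign()               # gradient-ascent step
            x_adv = x + (x_adv - x).clamp(-eps, eps)          # project back into the eps-ball
            x_adv = x_adv.clamp(0, 1)                         # stay a valid image
        return x_adv.detach()

    def adversarial_training_step(model, optimizer, x, y):
        """One training step on adversarial examples instead of clean ones."""
        model.eval()                                          # freeze BN/dropout while crafting the attack
        x_adv = pgd_attack(model, x, y)
        model.train()
        optimizer.zero_grad()
        loss = F.cross_entropy(model(x_adv), y)
        loss.backward()
        optimizer.step()
        return loss.item()

Training on such worst-case examples rather than only on clean data is one common route to the perturbation robustness described above.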


[B] As Toolkits -- Automated Machine Learning (AutoML), and Learning-Augmented Optimization

I am increasingly enthusiastic about the rising field of AutoML, on both consolidating its theoretical underpinnings and broadening its practical applicability. State-of-the-art ML systems consist of complex pipelines, with model architectures, algorithms, hyperparameters, and other configuration details to be tuned for optimal performance. They often further need to be co-designed under multiple goals and constraints. I consider AutoML a powerful tool and a central hub for addressing those design challenges faster and better.

    • Neural Architecture Search (NAS): We unleash the potential of NAS on more complicated model types, task scenarios, and data modalities, which present research challenges that easily defeat existing NAS algorithms built for simpler models and "toy" tasks. Our representative works include the first NAS for GANs [ICCV'19, ICML'20], the first NAS for speech (speaker recognition) [arXiv'20], the first NAS for Bayesian deep networks [ICML'20], and state-of-the-art NAS for real-time segmentation [ICLR'20].
    • Learning-Augmented Optimization (L2O): A fast-rising subfield of AutoML investigates using ML to develop data- and task-specific optimization algorithms, a.k.a. learning to optimize. This subfield is still immature in both theory and practice, and I see BIG opportunities in both (a minimal code sketch follows the list below):
        • Theory: While classic optimization results often provide worst-case guarantees, little theory exists for learned optimizers. Our works pioneer establishing convergence (rates) and simplifying parameter complexity, for convex problems such as LASSO [NeurIPS'18, ICLR'19], plug-and-play optimization with learned priors [ICML'19], and solving SVD using linear autoencoders [ICML'20]. We have also recently studied the "unseen generalizability" of learned optimizers for the first time [arXiv'20].
        • Practice: To enlarge the practical scope of learned optimizers in handling intractable optimization, we explore their use in Bayesian swarm optimization [NeurIPS'19], domain generalization [ICML'20], hardware-aware on-device training [ECCV'20], noisy-label training [ICML'20], and graph network training [CVPR'20, ICML'20]. My earlier works also empirically suggested the benefit of incorporating optimization-inspired building blocks into the design of better deep models [CVPR'16, AAAI'16, KDD'18, NeurIPS'18, etc.].
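
To make the L2O idea concrete, here is a minimal learning-to-optimize sketch in PyTorch: a tiny coordinate-wise network proposes update steps from gradients, and is itself meta-trained by unrolling on randomly sampled quadratic problems. Everything here (the update network, unroll length, problem distribution) is an illustrative assumption for exposition, not the setup of any paper cited above.

    import torch
    import torch.nn as nn

    class LearnedOptimizer(nn.Module):
        """A coordinate-wise update rule: maps each gradient entry to an update step."""
        def __init__(self, hidden=32):
            super().__init__()
            self.net = nn.Sequential(nn.Linear(1, hidden), nn.ReLU(), nn.Linear(hidden, 1))

        def forward(self, grad):
            return self.net(grad.reshape(-1, 1)).reshape(grad.shape)

    def sample_quadratic(dim=10):
        """Sample a random optimizee f(x) = ||Ax - b||^2."""
        A, b = torch.randn(dim, dim), torch.randn(dim)
        return lambda x: ((A @ x - b) ** 2).sum()

    opt_net = LearnedOptimizer()
    meta_opt = torch.optim.Adam(opt_net.parameters(), lr=1e-3)

    for meta_step in range(1000):              # meta-training over many sampled problems
        f = sample_quadratic()
        x = torch.zeros(10, requires_grad=True)
        meta_loss = 0.0
        for t in range(20):                    # unroll the learned optimizer for 20 steps
            grad, = torch.autograd.grad(f(x), x, create_graph=True)
            x = x + opt_net(grad)              # apply the learned update
            meta_loss = meta_loss + f(x)       # accumulate loss along the trajectory
        meta_opt.zero_grad()
        meta_loss.backward()                   # backprop through the unrolled updates
        meta_opt.step()

After meta-training, the learned optimizer is frozen and applied to new problem instances; how well it transfers beyond its training distribution is precisely the "unseen generalizability" question raised in the Theory bullet above.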


[C] As Applications -- Computer Vision and Interdisciplinary Problems

Sponsorship

VITA is gratefully sponsored by (details available upon request):

  • Government: NSF (5 grants), DoD (5 grants; including DARPA and ARL)
  • Industry: IBM Research, Adobe, Microsoft, NEC Labs, AWS, Chevron, Walmart, Kuaishou, USAA, Varian, MoodMe
  • University: TAMU X-Grant (2018), T3-Grant (2019), PESCA (2019)

Notes to Prospective Students


  • I am always looking for truly strong Ph.D. students, every semester. Research assistantships (RAs) will be provided. Interested candidates: please email me your CV, transcripts, and a brief research statement.
  • TAMU is a great place for AI/ML/CV research. TAMU is best known for its world-class College of Engineering (ranked 11th by US News 2017). According to csrankings.org, in 2018 the TAMU CSE department ranked 26th nationwide across all CS research areas, and 22nd in the specific field of AI.
  • I firmly believe in the value of two things:
      1. a truly deep understanding of your problem of interest - don’t naively plug and play "hot" tools;
      2. a solid background and a true passion for mathematics - I constantly benefit from digging more from matrix analysis, optimization, and statistical learning.
  • ... But ultimately, nothing is more important than true enthusiasm for, and devotion to, research.
  • I am hands-on and work very closely with every one of my students. I also provide strong support for internship, visiting, and collaboration opportunities, as you can see from my Group.
  • We welcome highly motivated M.S. students and undergraduates to explore research with us. Self-funded visiting students/scholars are also welcome to apply.