LangGPT
LangGPT is a personified structural prompt framework. It presents a structured, flexible and extensible framework for prompt creation with reference to the design ideas of object-oriented programming languages. It gives identity to large language models, greatly improving their performance on specific tasks.
- URL: https://github.com/langgptai/LangGPT
- Paper: LangGPT: Rethinking Structured Reusable Prompt Design Framework for LLMs from the Programming Language
- Contribution: Core Contributor & Co-builder, responsible for theoretical system refinement and paper writing.
Minstrel
Minstrel is an automatic prompt optimization (or generation) tool. It decomposes the prompt generation task and automates the generation of structural LangGPT prompts through collaboration between generative agents belonging to three working groups.
- URL: https://github.com/langgptai/Minstrel
- Paper: Minstrel: Structural Prompt Generation with Multi-Agents Coordination for Non-AI Experts
- Contribution: Person in Charge, hosting the design, development, and testing of Minstrel.
T-COL
T-COL is a counterfactual explanation generation method. It utilizes a link tree structure to select several local feature subsets through a partitioning strategy, which are eventually stitched together into a counterfactual interpretation.
- URL: https://github.com/sci-m-wang/T-COL
- Paper: T-COL: Generating Counterfactual Explanations for General User Preferences on Variable Machine Learning Systems
- Contribution: Person in Charge, hosting the design, development, and testing of T-COL.
MM-Bigbench
MM-BigBench, with a range of diverse metrics to thoroughly evaluate different models and instructions, including the Best Performance metric, the Mean Relative Gain metric, the Stability metric, and the Adaptability metric.
- URL: https://github.com/declare-lab/MM-BigBench
- Paper: MM-BigBench: Evaluating Multimodal Models on Multimodal Content Comprehension Tasks
- Contribution: Key Participant, evaluating multimodal large language models such as VPGTrans, LlaMa-Adapter, LlaVa, etc.
PICA
Although traditional language models play a good role in giving help, there is a problem of emotional participation deficiency, since long suggestive replies are against the goal of empathy. To solve the problem, we put forward the multi-turn dialogue model PICA, a chatbot that can empathize and rely on the emotion required.
- URL: https://github.com/NEU-DataMining/PICA
- Contribution: Key Participant, participating in data construction and supervised fine-tuning.
Funds
A Study of Interpretable Dialogue Generation Techniques for Emotion Awareness (No.62172086)
Student Participant; Funding: ¥600,000
National Natural Science Foundation of China (NSFC)
Internet of Things Teaching Experiment Box (No.201810217141)
Person in Charge; Funding: ¥10,000
National Student Innovation Training Projects
Smart Home System Based on Internet of Things (No.201810217140)
Key Participants; Funding: ¥15,000
National Student Innovation Training Projects
Adsorption Type Underwater Hull Surface Inspection Robot (No.201810217024)
Key Participants; Funding: ¥12,000
National Student Innovation Training Projects
Ant Travel (No.201810217256)
Key Participants; Funding: ¥27,000
National Student Entrepreneurship Training Projects
Indoor Space Optimisation and Beautification System Based on Deep Learning (No.Z-2018-039)
Person in Charge; Funding: ¥5,000
Harbin Engineering University Student Innovation Training Projects (Significant type)