ZIP addresses the high query requirements of existing methods by reducing problem dimensionality and using "intrinsic-dimensional gradient clipping."
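To make the idea concrete, the sketch below shows what optimizing through a reduced "intrinsic" dimension with gradient clipping can look like. This is a minimal illustration, not ZIP's actual algorithm: the projection matrix, dimensions, loss function, and hyperparameters are all assumptions, and a simple quadratic stands in for the black-box model.

```python
import numpy as np

# Illustrative sketch (not ZIP's real code): optimize a D-dimensional
# prompt through a much smaller d-dimensional space, clipping the
# low-dimensional zeroth-order gradient estimate.
rng = np.random.default_rng(0)
D, d = 512, 16                                 # full dim vs. intrinsic dim
A = rng.standard_normal((D, d)) / np.sqrt(d)   # fixed random projection

def loss(prompt):
    # Stand-in for a black-box model query (no gradients available).
    return float(np.sum((prompt - 1.0) ** 2))

def clip(g, max_norm=1.0):
    # "Intrinsic-dimensional" clipping: bound the d-dim gradient estimate.
    n = np.linalg.norm(g)
    return g * (max_norm / n) if n > max_norm else g

z = np.zeros(d)                                # only d parameters are tuned
eps, lr = 1e-2, 5e-2
for _ in range(200):
    u = rng.standard_normal(d)
    # Two-point zeroth-order gradient estimate: 2 queries per step.
    g = (loss(A @ (z + eps * u)) - loss(A @ (z - eps * u))) / (2 * eps) * u
    z -= lr * clip(g)

print(loss(A @ z) < loss(A @ np.zeros(d)))     # loss decreased from the start
```

The point of the sketch is that every update touches only `d` parameters and two model queries, regardless of how large `D` is.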
Reviewers highlighted that the paper's design choices, specifically "feature sharing," were well-motivated and helped the model stay expressive despite the simplifications.

Critical Perspectives
Because black-box prompt tuning is a niche field, some reviewers found it difficult to judge exactly how "new" the method was compared to the very latest unpublished research.

Community Feedback
Reviewers generally agreed that the method offers superior accuracy and efficiency across multiple tasks, supported by thorough ablation studies on design choices.
The primary consensus among reviewers is that ZIP significantly reduces the "query cost"—the number of times you have to ask the model for a result—while maintaining or improving accuracy.
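"Query cost" can be made tangible by wrapping a black-box scoring function with a counter, so the queries spent per optimization step become visible. This is a hypothetical illustration; the `CountingBlackBox` class and all numbers are invented for the example, not taken from the paper.

```python
import numpy as np

class CountingBlackBox:
    """Wraps a black-box function and counts how often it is queried."""
    def __init__(self, fn):
        self.fn, self.queries = fn, 0
    def __call__(self, x):
        self.queries += 1
        return self.fn(x)

rng = np.random.default_rng(1)
box = CountingBlackBox(lambda x: float(np.sum(x ** 2)))  # toy objective

x = rng.standard_normal(8)
eps = 1e-2
for _ in range(10):
    u = rng.standard_normal(8)
    # Each two-point gradient estimate costs exactly 2 queries.
    g = (box(x + eps * u) - box(x - eps * u)) / (2 * eps) * u
    x -= 0.05 * g

print(box.queries)  # 10 steps x 2 queries each = 20
```

Under this accounting, reducing the number of optimization steps or queries per step directly reduces the bill paid to the black-box model, which is the cost ZIP targets.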
Evaluators noted superior accuracy across 13+ different tasks and strong performance in "few-shot" settings (learning from very little data).
This paper introduces a method called ZIP, designed to improve how we tune large "black-box" models (like CLIP) when we don't have access to their internal code or gradients.

Performance and Efficiency