Mike Young

Originally published at aimodels.fyi

First Cross-App Multimodal Search Dataset Shows How Users Navigate Mobile Apps

This is a Plain English Papers summary of a research paper called First Cross-App Multimodal Search Dataset Shows How Users Navigate Mobile Apps. If you like these kinds of analyses, you should join AImodels.fyi or follow us on Twitter.

Overview

  • Qilin dataset contains 8.4 million multimodal search sessions from real mobile app usage
  • Features interactions across 9 different mobile apps with diverse content types
  • Includes text, image, and hybrid search queries with corresponding search results
  • First dataset to track user behaviors across multiple apps in sequence
  • Contains 2.2 million unique images and 6.9 million text documents
  • Enables research on how users navigate between different mobile applications (see the schema sketch after this list)
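
Taken together, these figures suggest a session-centric layout: each session ties an ordered sequence of queries, in any of the three modalities, to the results they returned and the app each query was issued in. The Python sketch below is a minimal illustration of what one such record might look like; every class and field name here is a hypothetical assumption for exposition, not the dataset's published schema.

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import Optional


class QueryModality(Enum):
    """The three query types described in the overview."""
    TEXT = "text"
    IMAGE = "image"
    HYBRID = "hybrid"  # combined text + image query


@dataclass
class SearchEvent:
    """One query issued inside a single app (all field names hypothetical)."""
    app_name: str                          # which app the query came from
    modality: QueryModality
    query_text: Optional[str] = None       # set for TEXT and HYBRID queries
    query_image_id: Optional[str] = None   # set for IMAGE and HYBRID queries
    result_doc_ids: list[str] = field(default_factory=list)
    clicked_doc_ids: list[str] = field(default_factory=list)


@dataclass
class Session:
    """A cross-app session: an ordered sequence of search events."""
    session_id: str
    events: list[SearchEvent] = field(default_factory=list)

    def apps_visited(self) -> list[str]:
        """Apps in the order the user moved between them, collapsing repeats."""
        order: list[str] = []
        for event in self.events:
            if not order or order[-1] != event.app_name:
                order.append(event.app_name)
        return order
```

Because the events are ordered, a structure like this is enough to recover the cross-app hops the dataset is built to study, which is what sets it apart from single-app query logs.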

Plain English Explanation

The Qilin dataset is like a digital diary of how people actually search for information on their phones. Instead of just tracking what people search for on a single app like Google, it follows users as they hop between different apps like news readers, social media, e-commerce ...
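
If each session records the ordered list of apps a user touched, the most basic navigation question, which app-to-app hops are common, reduces to counting adjacent pairs. Here is a minimal sketch of that idea, assuming each session has already been reduced to its ordered app list; the app names are made up for illustration.

```python
from collections import Counter
from typing import Iterable


def app_transitions(sessions: Iterable[list[str]]) -> Counter:
    """Count direct hops from one app to the next across all sessions."""
    transitions: Counter = Counter()
    for apps in sessions:
        # zip pairs each app with its successor: (a, b), (b, c), ...
        transitions.update(zip(apps, apps[1:]))
    return transitions


# Two hypothetical sessions, each an ordered list of apps visited.
sessions = [
    ["news_reader", "social", "ecommerce"],
    ["social", "ecommerce"],
]
print(app_transitions(sessions).most_common())
# [(('social', 'ecommerce'), 2), (('news_reader', 'social'), 1)]
```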

Click here to read the full summary of this paper
