Apple debuts OpenELM, a family of language models with 270M, 450M, 1.1B, and 3B parameters, designed to run on-device, pre-trained and fine-tuned on public data (Shubham Sharma/VentureBeat)

Shubham Sharma / VentureBeat:
Apple debuts OpenELM, a family of language models with 270M, 450M, 1.1B, and 3B parameters, designed to run on-device, pre-trained and fine-tuned on public data  —  Just as Google, Samsung and Microsoft continue to push their efforts with generative AI on PCs and mobile devices …

from Techmeme https://ift.tt/4jexE1a
