The Fourth Paradigm introduces ModelHub AIoT, an edge solution for large-model inference
Jinshi Data news, February 26: the Fourth Paradigm has launched ModelHub AIoT, an edge-side solution for large-model inference. Users can easily deploy small distilled models, such as the DeepSeek R1, Qwen 2.5, and Llama 2/3 series, and run them offline at the edge. The solution lets users switch flexibly between multiple models while balancing model compression against inference performance, reducing the complexity of deployment and optimization. According to the company, the solution not only meets users' demands for privacy and real-time responsiveness but also greatly reduces the cost of large-model AI inference.
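ModelHub AIoT's own SDK is not documented in the announcement, so the sketch below only illustrates the general pattern it describes: loading a small distilled model from local storage and running inference fully offline on an edge device. It uses the open-source Hugging Face transformers runtime, a published DeepSeek R1 distillation, and a hypothetical local path; none of these are part of the Fourth Paradigm's product.

```python
# Illustrative sketch only -- NOT the ModelHub AIoT SDK (its API is not public here).
# Demonstrates offline edge inference with a small distilled model using the
# open-source Hugging Face transformers runtime.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed local directory where the distilled model was pre-downloaded (hypothetical path).
MODEL_DIR = "/opt/models/DeepSeek-R1-Distill-Qwen-1.5B"

# local_files_only=True keeps the run fully offline (no network/hub access).
tokenizer = AutoTokenizer.from_pretrained(MODEL_DIR, local_files_only=True)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_DIR,
    torch_dtype=torch.float16,   # half precision to fit edge memory budgets
    local_files_only=True,
)
model.eval()

prompt = "Summarize the benefits of on-device inference in one sentence."
inputs = tokenizer(prompt, return_tensors="pt")
with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

In practice, edge deployments often go further with quantized formats and lightweight runtimes to fit tighter memory and latency budgets; the trade-off between compression and inference quality is the one the announcement says the solution manages for the user.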