
Jae Hutchinson

sirmyrrh

AI & ML interests

None yet

Organizations

None yet

sirmyrrh's activity

New activity in Undi95/Lumimaid-Magnum-12B 25 days ago

Magnum v4
#4 opened about 1 month ago by EloyOn
reacted to DawnC's post with 🤗 about 1 month ago
💡 Curious about dog breeds? 🐕 Meet PawMatchAI!
I've created this fun and interactive project to help you recognize dog breeds, find the perfect pup for your lifestyle, and even compare different breeds! Recently upgraded with smarter AI detection - it can now better distinguish between dogs and non-dogs (no more confusing cats for huskies! 😺➡️🐕).

🐾 What's cool about it?
Smart breed recognition powered by AI
Lifestyle-based breed recommendations
Detailed breed comparisons
And now with enhanced non-dog filtering!

🌟 Why try it?
Whether you're a dog lover, considering a new furry friend, or just curious, PawMatchAI makes discovering breeds fun and informative! As someone passionate about both AI and pets, I'm combining my two loves while working toward my goal of contributing to the AI industry.

🔎 Got feedback?
While it's not perfect, your input helps make it better! I'd love to hear your thoughts as I continue improving this project on my journey into AI development.

👉 Try it now: DawnC/PawMatchAI

🎯 Your support matters!
Every like 👍 or comment 📝 helps fuel my passion for AI development and keeps me motivated to create more helpful tools. Let's make the AI journey fun and impactful together!

#AI #MachineLearning #DeepLearning #Pytorch #ComputerVision
reacted to MoritzLaurer's post with 👍 about 1 month ago
I've been building a small library for working with prompt templates on the HF hub: pip install prompt-templates. Motivation:

The community currently shares prompt templates in a wide variety of formats: in datasets, in model cards, as strings in .py files, as .txt/.yaml/.json/.jinja2 files etc. This makes sharing and working with prompt templates unnecessarily complicated.

Prompt templates are currently the main hyperparameter that people tune when building complex LLM systems or agents. If we don't have a common standard for sharing them, we cannot systematically test and improve our systems. After comparing different community approaches, I think that working with modular .yaml or .json files is the best approach.
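
To make the idea concrete, here is a rough sketch of a modular YAML prompt template filled in with plain Python. The field names and layout below are hypothetical illustrations of the format idea, not the prompt-templates library's actual schema:

# Hypothetical modular prompt template stored as YAML (illustrative schema only,
# not the prompt-templates library's own format).
import yaml  # pip install pyyaml

TEMPLATE_YAML = """
template: |
  You are a helpful assistant.
  Answer the question concisely: {question}
input_variables:
  - question
"""

template = yaml.safe_load(TEMPLATE_YAML)
prompt = template["template"].format(question="What does an attention mechanism do?")
print(prompt)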

The prompt-templates library:
- proposes a standard for sharing prompts (entirely locally or on the HF hub)
- provides some utilities that are interoperable with the broader ecosystem

Try it:
# !pip install prompt-templates
from prompt_templates import PromptTemplateLoader 
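# load a shared prompt template (a YAML file) directly from a Hub repo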
prompt_template = PromptTemplateLoader.from_hub(repo_id="MoritzLaurer/closed_system_prompts", filename="claude-3-5-artifacts-leak-210624.yaml")


The library is in its early stages; feedback is welcome!

More details in the docs: https://github.com/MoritzLaurer/prompt_templates/
reacted to julien-c's post with 😎 about 1 month ago
After some heated discussion 🔥, we clarify our intent re. storage limits on the Hub

TL;DR:
- public storage is free, and (unless blatant abuse) unlimited. We do ask that you consider upgrading to PRO and/or Enterprise Hub if possible
- private storage is paid above a significant free tier (1TB if you have a paid account, 100GB otherwise)

docs: https://huggingface.co/docs/hub/storage-limits

We optimize our infrastructure continuously to scale our storage for the coming years of growth in Machine learning, to the benefit of the community 🔥

cc: @reach-vb @pierric @victor and the HF team
reacted to Kseniase's post with 🔥 about 1 month ago
TL;DR: The Story of Attention's Development by @karpathy

Origin: First proposed in 2014 by @Dzmitry Bahdanau, @KyunghyunCho, and Yoshua Bengio in Neural Machine Translation by Jointly Learning to Align and Translate (1409.0473). Inspired by cognitive processes and later renamed from "RNNSearch."

Key Idea: A data-dependent weighted average for pooling and communication, enabling flexible and powerful neural network connections.

Breakthrough: Bahdanau's "soft search" mechanism (softmax + weighted averaging) solved encoder-decoder bottlenecks in machine translation (see the short sketch after this list).
Transformer Revolution: Attention Is All You Need (1706.03762) (2017) by @ashishvaswanigoogle et al. simplified architectures by stacking attention layers, introducing multi-headed attention and positional encodings.
Legacy: Attention replaced RNNs, driving modern AI systems like ChatGPT. It emerged independently but was influenced by contemporaneous work like Alex Graves’s Neural Turing Machines (1410.5401) and Jason Weston’s Memory Networks (1410.3916).
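
For readers who want the core mechanism in code, here is a minimal NumPy sketch of that "softmax + weighted averaging" idea. It uses a simple dot-product score for brevity, not Bahdanau's exact additive formulation:

# Soft attention in a few lines: score each encoder state against a query,
# softmax the scores, and return the data-dependent weighted average (context vector).
import numpy as np

def soft_attention(query, keys, values):
    scores = keys @ query                      # one relevance score per position
    weights = np.exp(scores - scores.max())    # numerically stable softmax
    weights /= weights.sum()
    return weights @ values                    # weighted average over the values

# toy example: 4 encoder states with 3-dim keys/values, one decoder query
rng = np.random.default_rng(0)
keys, values = rng.normal(size=(4, 3)), rng.normal(size=(4, 3))
query = rng.normal(size=3)
context = soft_attention(query, keys, values)  # shape (3,)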

Attention to history: Jürgen Schmidhuber claims his 1992 Fast Weight Programmers anticipated modern attention mechanisms. While conceptually similar, the term “attention” was absent, and there’s no evidence it influenced Bahdanau, Cho, and Bengio’s 2014 work. Paying attention (!) to history might have brought us to genAI earlier – but credit for the breakthrough still goes to Montreal.

Referenced Papers:
Attention Origin: Neural Machine Translation by Jointly Learning to Align and Translate (1409.0473)
Transformers: Attention Is All You Need (1706.03762)
Alex Graves' Work: Neural Turing Machines (1410.5401), Generating Sequences With Recurrent Neural Networks (1308.0850)
Jason Weston's ( @spermwhale ) Memory Networks (1410.3916)
Sequence to Sequence Learning with Neural Networks (1409.3215) by Ilya Sutskever ( @ilyasut ), Oriol Vinyals, Quoc V. Le

Who else deserves recognition in this groundbreaking narrative of innovation? Let’s ensure every contributor gets the credit they deserve. Leave a comment below 👇🏻🤗