| HN Points | HN Title (Links to original post) | Submitted Date |
|---|---|---|
| 4 | How are people training this LLMs? Dont they need lot of money? | 2024-01-19 |
| 3 | Serving Open Source Models 4x faster than vLLM by quantizing with ~no tradeoffs | 2024-01-10 |
| 3 | FireAttention – Serving Mixtral and open-source MoE models at 4x speed vs. vLLM | 2024-01-09 |