Token estimation, API costs, model sizing, training metrics, and more
Estimate token count for LLM input text
Estimate OpenAI, Anthropic, Google API costs
Estimate GPU memory for model parameters
Calculate precision, recall, F1 from a confusion matrix
Estimate compute (FLOPs) for training/inference
Estimate training time from dataset and hardware
Calculate perplexity from cross-entropy loss
Estimate optimal embedding dimensions
Estimate the optimal batch size for available GPU memory
Suggest learning rates by model size
Calculate image pixel count, aspect ratio, and memory footprint
Apply the softmax function to a set of values
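The token-count estimator can be sketched with the common rule of thumb of roughly 4 characters per token for English text; exact counts require the model's own tokenizer, so this is only an approximation.

```python
def estimate_tokens(text: str) -> int:
    """Rough token estimate: ~4 characters per token for English text."""
    return max(1, round(len(text) / 4))
```

For precise counts, run the text through the provider's tokenizer instead of this heuristic.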
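The API-cost estimate is a simple rate calculation; prices here are caller-supplied parameters (USD per million tokens), not hardcoded vendor quotes, since providers change pricing over time.

```python
def estimate_api_cost(input_tokens: int, output_tokens: int,
                      input_price_per_m: float, output_price_per_m: float) -> float:
    """Cost in USD given per-million-token prices for input and output."""
    return (input_tokens / 1e6 * input_price_per_m
            + output_tokens / 1e6 * output_price_per_m)
```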
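GPU memory for model weights is parameter count times bytes per parameter; the 20% overhead factor for activations and buffers is an assumption, and real usage varies with batch size and framework.

```python
BYTES_PER_PARAM = {"fp32": 4, "fp16": 2, "bf16": 2, "int8": 1, "int4": 0.5}

def estimate_gpu_memory_gb(num_params: float, dtype: str = "fp16",
                           overhead: float = 1.2) -> float:
    """Memory in GiB for weights alone, scaled by an overhead factor."""
    return num_params * BYTES_PER_PARAM[dtype] * overhead / 1024**3
```

For example, a 7B-parameter model in fp16 needs roughly 13 GiB for weights before overhead.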
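Precision, recall, and F1 follow directly from the confusion-matrix cells; true negatives are part of the matrix but do not enter these three metrics.

```python
def precision_recall_f1(tp: int, fp: int, fn: int, tn: int = 0):
    """Compute precision, recall, and F1 from confusion-matrix counts."""
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if (precision + recall) else 0.0)
    return precision, recall, f1
```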
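Compute estimates for transformers commonly use the approximation of ~6 FLOPs per parameter per training token (forward plus backward) and ~2 FLOPs per parameter per generated token at inference.

```python
def training_flops(num_params: float, num_tokens: float) -> float:
    """Approximate total training compute: ~6 FLOPs per parameter per token."""
    return 6 * num_params * num_tokens

def inference_flops_per_token(num_params: float) -> float:
    """Approximate forward-pass compute: ~2 FLOPs per parameter per token."""
    return 2 * num_params
```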
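Training time follows from total compute divided by sustained throughput; the 40% default utilization is an assumption, since real clusters rarely reach peak hardware FLOPS.

```python
def estimate_training_days(total_flops: float, gpu_peak_flops: float,
                           num_gpus: int, utilization: float = 0.4) -> float:
    """Wall-clock days = total FLOPs / (peak rate x GPU count x utilization)."""
    seconds = total_flops / (gpu_peak_flops * num_gpus * utilization)
    return seconds / 86400
```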
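Perplexity is the exponential of the cross-entropy loss; this assumes the loss is measured in nats (natural log), as is typical for deep-learning frameworks.

```python
import math

def perplexity(cross_entropy_nats: float) -> float:
    """Perplexity = exp(cross-entropy), with loss in nats."""
    return math.exp(cross_entropy_nats)
```

A loss of 0 gives perplexity 1 (a perfect model); lower loss always means lower perplexity.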
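For embedding dimensions, a sketch assuming categorical-feature embeddings: the fourth-root-of-category-count rule of thumb is a common heuristic, not a guarantee, and transformer hidden sizes are chosen by different considerations.

```python
import math

def suggest_embedding_dim(num_categories: int) -> int:
    """Fourth-root rule of thumb for categorical embedding size (heuristic)."""
    return max(1, math.ceil(num_categories ** 0.25))
```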
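A batch-size estimate can be sketched as free memory divided by per-sample memory; the per-sample figure must be measured or estimated for the specific model, and linear scaling with batch size is an assumption.

```python
def max_batch_size(free_memory_gb: float, memory_per_sample_mb: float) -> int:
    """Largest batch that fits: free memory (MiB) // memory per sample (MiB)."""
    return max(1, int(free_memory_gb * 1024 // memory_per_sample_mb))
```

In practice, leave headroom and round down to a convenient size rather than running at the exact limit.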
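Learning-rate suggestions can only be illustrative: the tiers below are an assumption loosely modeled on the inverse scaling seen in published LLM training setups, and real values depend on optimizer, schedule, and batch size.

```python
def suggest_learning_rate(num_params: float) -> float:
    """Illustrative peak learning rate by model size (heuristic tiers, not a rule)."""
    if num_params < 1e9:
        return 6e-4
    if num_params < 1e10:
        return 3e-4
    if num_params < 1e11:
        return 1.5e-4
    return 1e-4
```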
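The image calculations reduce the aspect ratio with a GCD and compute uncompressed memory from pixel count, channels, and bytes per channel.

```python
from math import gcd

def image_stats(width: int, height: int, channels: int = 3,
                bytes_per_channel: int = 1):
    """Return pixel count, reduced aspect ratio, and uncompressed size in bytes."""
    pixels = width * height
    d = gcd(width, height)
    aspect = (width // d, height // d)
    memory_bytes = pixels * channels * bytes_per_channel
    return pixels, aspect, memory_bytes
```

For example, a 1920x1080 RGB image is ~2.07 megapixels with a 16:9 aspect ratio.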
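The softmax calculation can be sketched in its numerically stable form, subtracting the maximum value before exponentiating to avoid overflow.

```python
import math

def softmax(values):
    """Numerically stable softmax: outputs are positive and sum to 1."""
    m = max(values)
    exps = [math.exp(v - m) for v in values]
    total = sum(exps)
    return [e / total for e in exps]
```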