Author: LLaMA-2 7B Chat
LLaMA-2 is the next generation of LLaMA. Meta trained and released LLaMA-2 in three model sizes: 7, 13, and 70 billion parameters. The model architecture remains largely unchanged from that of the LLaMA-1 models, but 40% more data was used to train the foundation models. The accompanying preprint also mentions a 34B-parameter model that might be released in the future once it satisfies safety targets.
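Because the released sizes keep the LLaMA-1 architecture, the 7B chat variant can be used through the standard Hugging Face transformers causal-LM interface. The sketch below is illustrative only: the repository ID, precision, and generation settings are assumptions, and downloading the weights requires accepting Meta's license on the Hub.

```python
# Minimal sketch: loading the LLaMA-2 7B chat model with Hugging Face transformers.
# The repository ID "meta-llama/Llama-2-7b-chat-hf" and the settings below are
# assumptions for illustration, not a prescribed configuration.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-2-7b-chat-hf"  # assumed Hub repository ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision so the 7B weights fit on a single GPU
    device_map="auto",          # let accelerate place layers on available devices
)

prompt = "Explain what changed between LLaMA-1 and LLaMA-2 in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=64, do_sample=False)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```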
Computer Science, Computer Vision and Pattern Recognition
Augmenting Generative Models with Forgery-Aware Transformers for Image Classification
Computer Science, Software Engineering
Efficient Program Repair via Invariant-Based Learning and Refinement
Computer Science, Computer Science and Game Theory
Computational Complexity in Kidney Exchange Programs
A Collection of Problems on Complex Analysis
Computation and Language, Computer Science
Optimizing Written Products: A Computational Approach to Enhance Message and Presentation Quality
Abelian Repetition Threshold Revisited: A Comprehensive Review
Mathematics, Representation Theory
Skew-Symmetric Matrix Polynomials and Their Eigenstructure: Orbits of Complete Eigenstructures
Computation and Language, Computer Science
Preference Optimization with the Pairwise Cringe Loss
Mathematics, Numerical Analysis
Outlining the Paper’s Structure via Decorated Trees and Duhamel Iteration
Computer Science, Computer Vision and Pattern Recognition
Personalizing Text-to-Image Generation with Textual Inversion
Computer Science, Logic in Computer Science
Unifying Semirings and Monads in Probabilistic Programming Languages
Computer Science, Information Retrieval
GNN Models Outperformed by Simple Techniques in Session-Based Recommendation Systems
Computer Science, Human-Computer Interaction
Advanced Data Fusion for Smart Homes: Levels 3-4 and Beyond
Mathematics, Optimization and Control
Computational Trade-Offs of OBBT in ReLU Networks
Advanced Surface Wave Methods for High-Frequency Scattering Problems
Computation and Language, Computer Science
Optimizing Large Language Models with Tabular Data
Artificial Intelligence, Computer Science
Granular Representation of Fuzzy Rough Sets: A Comparative Study
Electrical Engineering and Systems Science, Systems and Control