Computer Science, Machine Learning
Author: LLaMA 2 7B Chat
LLaMA-2 is the next generation of LLaMA. Meta trained and released LLaMA-2 in three model sizes: 7, 13, and 70 billion parameters. The model architecture remains largely unchanged from that of the LLaMA-1 models, but 40% more data was used to train the foundation models. The accompanying preprint also mentions a 34B-parameter model that may be released in the future once it satisfies safety targets.
Computation and Language, Computer Science
Enhancing Comparative Opinion Mining with Multi-Dimensional Embeddings
Computer Science, Information Theory
Optimizing RIS Reflection Coefficients for Tight Cramér-Rao Lower Bounds
Evaluating and Improving Machine Learning Models for Regioselectivity Prediction in Electrophilic Aromatic Substitution Reactions
Computer Science, Computer Vision and Pattern Recognition
Neural Radiance Fields with Reflections: A Comprehensive Review
Computer Science, Machine Learning
Efficient Model-Heterogeneous Personalized Federated Learning via Semantic Similarity-based Aggregation
Computer Science, Information Theory
AI Hype vs. Reality: Navigating Production Pitfalls in Automated Content Creation
Artificial Intelligence, Computer Science
Scaling Multi-Agent Reinforcement Learning with Selective Parameter Sharing
Computer Science, Machine Learning
Exploring the Power of Network Analysis: A Comprehensive Review of Datasets and Performance Metrics
Computer Science, Machine Learning
Symmetry in Machine Learning: A Fundamental Concept and Its Applications
Electrical Engineering and Systems Science, Systems and Control
Detecting and Isolating Pitch System Faults in Offshore Wind Turbines: A Review
Computer Science, Computer Vision and Pattern Recognition
Robustness Transfer in Deep Learning: A Single Model’s Solution to Multiple Noise Levels
Electrical Engineering and Systems Science, Image and Video Processing
Unlocking Brain Functional Networks via Graph Convolutional Neural Networks
Computer Science, Machine Learning
Federated Graph Neural Network Learning with Heterogeneous Datasets: A Comparative Study of Clustering and Community Detection Methods
Computation and Language, Computer Science
Adapting LLMs for NER in Astronomical Literature via Knowledge Graphs and Prompt Engineering
Optimizing Redundant Manipulator Motion with Bayesian Monte Carlo and Importance Sampling
Computation and Language, Computer Science
Adaptive Rounding for Post-Training Quantization in Large Language Models
Computational Geometry, Computer Science
Realizing Directional Walks: R-Complexity of Embedding Without Repeating Edges
Computer Science, Computer Vision and Pattern Recognition
Unlocking Image Quality: State-of-the-Art Deblurring and Enhancement Techniques
Computer Science, Hardware Architecture