Computer Science, Machine Learning
Author: LLama 2 7B Chat
LLaMA-2 is the next generation of LLaMA. Meta trained and released LLaMA-2 in three model sizes: 7, 13, and 70 billion parameters. The model architecture remains largely unchanged from that of the LLaMA-1 models, but 40% more data was used to train the foundation models. The accompanying preprint also mentions a 34B-parameter model that might be released in the future once it satisfies safety targets.
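For readers who want to try the released checkpoints, the sketch below shows one common way to load a LLaMA-2 chat model with the Hugging Face transformers library. It is only an illustration, not part of the release described above: the repository name meta-llama/Llama-2-7b-chat-hf, approved gated access, and the installed accelerate package are all assumptions about the reader's setup.

    # Minimal sketch (assumptions: gated access to the meta-llama repo is granted,
    # and the transformers + accelerate packages are installed).
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "meta-llama/Llama-2-7b-chat-hf"  # 7B chat variant; 13B and 70B follow the same pattern

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype=torch.float16,  # half precision to reduce memory use
        device_map="auto",          # spread layers across available GPUs/CPU
    )

    prompt = "Summarize the difference between LLaMA-1 and LLaMA-2 in one sentence."
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output_ids = model.generate(**inputs, max_new_tokens=64)
    print(tokenizer.decode(output_ids[0], skip_special_tokens=True))

The same loading pattern applies to the larger checkpoints; only the repository name and the memory budget change.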
Computer Science, Machine Learning
Pruning Deep Neural Networks for Efficient Inference and Training
Computer Science, Computer Vision and Pattern Recognition
Large-Scale Pre-Trained DMs Enable Efficient and Fidelity-Aware Style Transfer
Ethics and Regulation in the Age of Quantum AI: Navigating the Complexity
Computer Science, Information Retrieval
Mitigating Bias in MOOC Recommendation Systems Through Enriched Knowledge Graphs
Computer Science, Computers and Society
Mitigating Bias in Predictive Modeling: A Focus on Locally Optimal Models
Computer Science, Computer Vision and Pattern Recognition
Realtime Human Motion Capture Using Inverse Kinematics and von Mises-Fisher Sampling
Computer Science, Software Engineering
Code Summarization and Awareness Framework for Repositories
Computer Science, Computer Vision and Pattern Recognition
Emotion Recognition in Speech: A Comparative Study of Different Models
Computer Science, Machine Learning
Differentiable Particle Filters for Data-Adaptive Sequential Bayesian Inference
Computer Science, Machine Learning
Inferring Structural Inference Methods for Nonlinear Dynamic Systems
Computer Science, Human-Computer Interaction
Modeling and Visualizing ChatGPT’s Topics: Understanding the Capabilities and Limitations of Large Language Models
Computer Science, Information Retrieval
Improving Data Quality and Preprocessing for Better Predictions in Audio Features Analysis
Computer Science, Information Retrieval
Fairness in Federated Recommender Systems: A Comprehensive Survey
Computer Science, Machine Learning
Post-processing Bias Mitigation: A Time-Saving Approach for Large Data Sets
Computer Science, Machine Learning
Low-Precision Training of Deep Neural Networks: Challenges and Solutions
Computer Science, Information Retrieval
Attention is All You Need: A Comprehensive Review of Contrastive Learning for Click-Through Rate Prediction
Computer Science, Computers and Society
Understanding Scientific Collaboration in Artificial Intelligence Research: A Bibliometric Analysis
Optimal Control of a Pendulum System with HEBI X5-9 Actuator: Naming Convention and Experimental Setup
Computer Science, Information Theory