candle-nn
§Other Crates
Candle consists of a number of crates. This crate holds structs and functions that allow you to build and train neural nets. You may wish to look at the docs for the other crates, which can be found here:
- candle-core. Core data structures and data types.
- candle-nn. Building blocks for Neural Nets.
- candle-datasets. Rust access to commonly used Datasets like MNIST.
- candle-examples. Examples of Candle in Use.
- candle-onnx. Loading and using ONNX models.
- candle-pyo3. Access to Candle from Python.
- candle-transformers. Candle implementation of many published transformer models.
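As a quick taste of the building blocks this crate provides, here is a minimal sketch of a single linear layer built and run with candle-nn. The layer name `"layer"` and all dimensions are illustrative, not prescribed by the crate:

```rust
use candle_core::{DType, Device, Result, Tensor};
use candle_nn::{linear, Linear, Module, VarBuilder, VarMap};

fn main() -> Result<()> {
    let device = Device::Cpu;
    // Fresh, trainable variables; a checkpoint could be mmap-loaded instead.
    let varmap = VarMap::new();
    let vb = VarBuilder::from_varmap(&varmap, DType::F32, &device);

    // A single linear layer: 4 inputs -> 2 outputs (sizes are illustrative).
    let layer: Linear = linear(4, 2, vb.pp("layer"))?;

    // Run a batch of 3 random input rows through the layer.
    let xs = Tensor::randn(0f32, 1f32, (3, 4), &device)?;
    let ys = layer.forward(&xs)?;
    assert_eq!(ys.dims(), &[3, 2]);
    Ok(())
}
```

The `Module` trait (re-exported below from candle-core) is what gives the layer its `forward` method; most layers in this crate implement it.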
Re-exports§
pub use activation::prelu;
pub use activation::Activation;
pub use activation::PReLU;
pub use attention::scaled_dot_product_attention;
pub use batch_norm::batch_norm;
pub use batch_norm::BatchNorm;
pub use batch_norm::BatchNormConfig;
pub use conv::conv1d;
pub use conv::conv1d_no_bias;
pub use conv::conv2d;
pub use conv::conv2d_no_bias;
pub use conv::conv_transpose1d;
pub use conv::conv_transpose1d_no_bias;
pub use conv::conv_transpose2d;
pub use conv::conv_transpose2d_no_bias;
pub use conv::Conv1d;
pub use conv::Conv1dConfig;
pub use conv::Conv2d;
pub use conv::Conv2dConfig;
pub use conv::ConvTranspose1d;
pub use conv::ConvTranspose1dConfig;
pub use conv::ConvTranspose2d;
pub use conv::ConvTranspose2dConfig;
pub use embedding::embedding;
pub use embedding::Embedding;
pub use func::func;
pub use func::func_t;
pub use func::Func;
pub use func::FuncT;
pub use group_norm::group_norm;
pub use group_norm::GroupNorm;
pub use init::Init;
pub use layer_norm::layer_norm;
pub use layer_norm::rms_norm_non_quant;
pub use layer_norm::rms_norm_quant;
pub use layer_norm::LayerNorm;
pub use layer_norm::LayerNormConfig;
pub use layer_norm::RmsNorm;
pub use linear::linear;
pub use linear::linear_b;
pub use linear::linear_no_bias;
pub use linear::Linear;
pub use ops::kvconcat;
pub use ops::Dropout;
pub use optim::AdamW;
pub use optim::Optimizer;
pub use optim::ParamsAdamW;
pub use optim::SGD;
pub use rnn::gru;
pub use rnn::lstm;
pub use rnn::GRUConfig;
pub use rnn::LSTMConfig;
pub use rnn::GRU;
pub use rnn::LSTM;
pub use rnn::RNN;
pub use rope::RotaryEmbedding;
pub use sequential::seq;
pub use sequential::Sequential;
pub use var_builder::VarBuilder;
pub use var_map::VarMap;
pub use crate::core::Module;
pub use crate::core::ModuleT;
Modules§
- Activation Functions.
- Batch Normalization.
- Convolution Layers.
- Embedding Layer.
- Encoding Utilities (e.g., one-hot/cold encoding).
- Layers defined by closures.
- Group Normalization.
- Variable initialization.
- Cache Implementations.
- Layer Normalization.
- Linear Layer.
- Loss Calculations.
- Tensor ops.
- Various optimization algorithms.
- Recurrent Neural Networks.
- Rotary Embeddings.
- Sequential Layer.
- A VarBuilder is used to retrieve variables used by a model. These variables can either come from a pre-trained checkpoint, e.g. using VarBuilder::from_mmaped_safetensors, or be initialized for training, e.g. using VarBuilder::from_varmap.
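The VarMap/VarBuilder pair described above is also how trainable variables reach an optimizer. The following sketch initializes a model for training and takes one SGD step; the layer name `"fit"`, the learning rate, and the toy data are all illustrative assumptions:

```rust
use candle_core::{DType, Device, Result, Tensor};
use candle_nn::{linear, loss, Module, Optimizer, VarBuilder, VarMap, SGD};

fn main() -> Result<()> {
    let device = Device::Cpu;
    // The VarMap owns the trainable variables; from_varmap initializes them
    // fresh (for a checkpoint, from_mmaped_safetensors would be used instead).
    let varmap = VarMap::new();
    let vb = VarBuilder::from_varmap(&varmap, DType::F32, &device);
    let model = linear(2, 1, vb.pp("fit"))?;

    // One SGD step on a toy regression target (values are illustrative).
    let mut opt = SGD::new(varmap.all_vars(), 0.1)?;
    let xs = Tensor::new(&[[1f32, 2.], [3., 4.]], &device)?;
    let target = Tensor::new(&[[1f32], [2.]], &device)?;
    let loss = loss::mse(&model.forward(&xs)?, &target)?;
    opt.backward_step(&loss)?;
    Ok(())
}
```

`varmap.all_vars()` hands every registered variable to the optimizer, so layers created through the same VarBuilder are all updated by `backward_step`.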