Show HN: Rotta-Rs, Deep Learning Framework in Rust



An AI framework built on the Rust programming language.

What's new in this version:

  • Softplus
  • ln
  • powi
  • sigmoid
  • mul operation for tensors
  • sub operation for tensors
  • renamed the 'reshape' method to 'to_shape' in Arrayy
  • updated the cross entropy loss algorithm
  • updated the indexing algorithm on Arrayy
  • fixed a bug in tensor broadcasting
  • fixed a bug in the derivative of tensor division

You can see other versions via this link.

You can see the changes from the previous version at this link.

For now, ROTTA-rs is not yet available on crates.io. To use it, download the zip file from this link and extract it into your Rust project.

ROTTA-rs.zip

Note: ROTTA-rs also uses external dependencies; don't forget to add them to your Cargo.toml.

dependency   version   features
rand         0.9.1     -
rand_distr   0.5.1     -
uuid         1.17.0    v4
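
For reference, a minimal sketch of the matching Cargo.toml entries, taken directly from the table above:

    [dependencies]
    rand = "0.9.1"
    rand_distr = "0.5.1"
    uuid = { version = "1.17.0", features = ["v4"] }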

Suggestion: for convenience, you can extract it into the src folder alongside main.rs, then access the ROTTA-rs module as shown below.
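
A minimal sketch of that setup; the `use` line is an assumption, since the crate layout isn't shown in this post, but the examples below need these items in scope:

    mod rotta_rs;    // points at src/rotta_rs/ extracted from the zip
    use rotta_rs::*; // assumed: Tensor, Arrayy, etc. are accessible at the module root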

The only data type supported on tensors is f64.

There are three ways to create a tensor:

    mod rotta_rs;
    use rotta_rs::*; // assumed re-export; the types below must be in scope

    fn main() {
        // 1. from a nested array literal
        let tensor = Tensor::new([
            [1.0, 2.0, 3.0],
            [4.0, 5.0, 6.0],
        ]);

        // 2. from a shape and a flat vector
        let tensor = Tensor::from_vector(vec![3, 2], vec![1.0, 2.0, 3.0, 4.0, 5.0, 6.0]);

        // 3. from an Arrayy
        let arrayy = Arrayy::new([
            [1.0, 2.0, 3.0],
            [4.0, 5.0, 6.0],
        ]);
        let tensor = Tensor::from_arrayy(arrayy);
    }

Basic Operations On Tensors

This version still supports only a limited set of operations on tensors, including:

  • add
    mod rotta_rs;
    use rotta_rs::*;

    fn main() {
        let tensor_a = Tensor::new([[1.0, 2.0, 3.0], [1.0, 2.0, 3.0]]);
        let tensor_b = Tensor::new([[1.0, 2.0, 3.0], [1.0, 2.0, 3.0]]);

        // element-wise addition via operator overloading
        let result = &tensor_a + &tensor_b;
        // or via the free function
        let result = add(&tensor_a, &tensor_b);
    }
  • sub
    mod rotta_rs;
    use rotta_rs::*;

    fn main() {
        let tensor_a = Tensor::new([[1.0, 2.0, 3.0], [1.0, 2.0, 3.0]]);
        let tensor_b = Tensor::new([[1.0, 2.0, 3.0], [1.0, 2.0, 3.0]]);

        // element-wise subtraction via operator overloading
        let result = &tensor_a - &tensor_b;
        // or via the free function
        let result = sub(&tensor_a, &tensor_b);
    }
  • mul
    mod rotta_rs;
    use rotta_rs::*;

    fn main() {
        let tensor_a = Tensor::new([[1.0, 2.0, 3.0], [1.0, 2.0, 3.0]]);
        let tensor_b = Tensor::new([[1.0, 2.0, 3.0], [1.0, 2.0, 3.0]]);

        // element-wise multiplication via operator overloading
        let result = &tensor_a * &tensor_b;
        // or via the free function
        let result = mul(&tensor_a, &tensor_b);
    }
  • divide
    mod rotta_rs;
    use rotta_rs::*;

    fn main() {
        let tensor_a = Tensor::new([[1.0, 2.0, 3.0], [1.0, 2.0, 3.0]]);
        let tensor_b = Tensor::new([[1.0, 2.0, 3.0], [1.0, 2.0, 3.0]]);

        // element-wise division via operator overloading
        let result = &tensor_a / &tensor_b;
        // or via the free function (the API spells it 'divided')
        let result = divided(&tensor_a, &tensor_b);
    }
  • dot product
    mod rotta_rs;
    use rotta_rs::*;

    fn main() {
        let tensor_a = Tensor::new([1.0, 2.0, 3.0]);
        let tensor_b = Tensor::new([1.0, 2.0, 3.0]);

        // dot product of two 1-D tensors
        let result = dot(&tensor_a, &tensor_b);
    }
  • matmul
    mod rotta_rs;
    use rotta_rs::*;

    fn main() {
        let tensor_a = Tensor::new([[1.0, 2.0, 3.0], [1.0, 2.0, 3.0]]);   // shape (2, 3)
        let tensor_b = Tensor::new([[1.0, 2.0], [1.0, 2.0], [1.0, 2.0]]); // shape (3, 2)

        // matrix multiplication: (2, 3) x (3, 2) -> (2, 2)
        let result = matmul(&tensor_a, &tensor_b);
    }

Other operations

  • exp
  • sum axis
  • powi
  • ln
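
The post doesn't show signatures for these, so purely as a hypothetical sketch, assuming they follow the same free-function style as the operations above (the names and signatures here are assumptions, not the confirmed API):

    mod rotta_rs;
    use rotta_rs::*;

    fn main() {
        let tensor = Tensor::new([[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]]);

        // hypothetical calls; exact names/signatures are not confirmed by this post
        let e = exp(&tensor);         // element-wise e^x
        let l = ln(&tensor);          // element-wise natural logarithm
        let p = powi(&tensor, 2);     // element-wise integer power (assumed signature)
        let s = sum_axis(&tensor, 0); // sum along axis 0 (assumed signature)
    }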
To train a model, first initialize a Module, an optimizer, and a loss function:

    mod rotta_rs;
    use rotta_rs::*;

    fn main() {
        // model container that tracks trainable parameters
        let mut model = Module::init();
        // SGD optimizer over the model's parameters, learning rate 0.00001
        let optimizer = Sgd::init(model.parameters(), 0.00001);
        // sum-of-squared-residuals loss
        let loss_fn = SSResidual::init();
        // a linear layer with 1 input and 1 output (the API spells it 'liniar_init')
        let linear = model.liniar_init(1, 1);
    }
A complete example that trains a small two-layer network on a toy dataset:

    mod rotta_rs;
    use rotta_rs::*;

    fn main() {
        let mut model = Module::init();
        let optimizer = Sgd::init(model.parameters(), 0.00001);
        let loss_fn = SSResidual::init();

        let linear = model.liniar_init(1, 1);
        let linear_2 = model.liniar_init(1, 1);

        let input = Tensor::new([[1.0], [2.0]]);
        let actual = Tensor::new([[1.0], [4.0]]);

        for epoch in 0..100 {
            // forward pass through both layers with a ReLU in between
            let x = linear.forward(&input);
            let x = relu(&x);
            let output = linear_2.forward(&x);

            // compute and report the loss
            let loss = loss_fn.forward(&output, &actual);
            println!("epoch:{epoch} | loss => {loss}");

            // reset gradients, backpropagate, and update the parameters
            optimizer.zero_grad();
            loss.backward();
            optimizer.optim();
        }
    }
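
After training, a prediction is just another forward pass. A minimal sketch, continuing inside main after the loop above and using only calls already shown in this post:

    // reuse the trained layers for inference
    let x = relu(&linear.forward(&input));
    let prediction = linear_2.forward(&x);
    println!("prediction => {prediction}");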

Support the Developer With Donations

  • Trakteer

https://trakteer.id/araxnoid/tip

Follow me on social media to get the latest updates.

  • YouTube

araxnoid

Click here to go directly to YouTube.

  • TikTok

araxnoid

Click here to go directly to TikTok.

  • Gmail

[email protected]
