Add Mamba (minimal) #918

Closed
wants to merge 27 commits into from

Commits on Jan 26, 2024

  1. Commit 5c532ec
  2. Commit fb91f13

Commits on Feb 1, 2024

  1. Update safetensors module and naming

    - Makes the safetensors module private.
      - It is no longer exported in the preamble, which avoids a naming clash with the external safetensors crate.
    - Changes how and when the period is inserted into the saved key names.
      - This brings the keys closer to how the fields are accessed in the code.

    swfsql committed Feb 1, 2024 · a832f51
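    A minimal illustrative sketch of the module-privacy idea; the module layout and the `key_for` helper below are assumptions for illustration, not the crate's actual code:

    ```rust
    // Keep the internal module private so that a glob import of the preamble
    // does not shadow the external `safetensors` crate.
    mod safetensors {
        /// Hypothetical helper: join a prefix and a field name with a period,
        /// mirroring how the field is accessed in code (e.g. "blocks.0.weight").
        pub fn key_for(prefix: &str, field: &str) -> String {
            if prefix.is_empty() {
                field.to_string()
            } else {
                format!("{prefix}.{field}")
            }
        }
    }

    pub mod preamble {
        // The internal `safetensors` module is intentionally NOT re-exported here.
    }

    fn main() {
        assert_eq!(safetensors::key_for("blocks.0", "weight"), "blocks.0.weight");
        assert_eq!(safetensors::key_for("", "weight"), "weight");
    }
    ```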
  2. Commit 901cfe4
  3. add SiLU activation function

    swfsql committed Feb 1, 2024 · a14b40b
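    A minimal, self-contained sketch of what SiLU computes (plain `f32` math, not the crate's tensor kernels):

    ```rust
    /// SiLU (a.k.a. swish): silu(x) = x * sigmoid(x).
    fn silu(x: f32) -> f32 {
        x / (1.0 + (-x).exp())
    }

    /// Derivative used by the backward pass:
    /// d/dx [x * s(x)] = s(x) * (1 + x * (1 - s(x))), where s is the sigmoid.
    fn silu_df(x: f32) -> f32 {
        let s = 1.0 / (1.0 + (-x).exp());
        s * (1.0 + x * (1.0 - s))
    }

    fn main() {
        for x in [-2.0f32, 0.0, 2.0] {
            println!("silu({x}) = {}, silu'({x}) = {}", silu(x), silu_df(x));
        }
    }
    ```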
  4. add RMS normalization

    - Add the `try_normalize_rms` related functions.
    - Add the `LayerRMSNorm1D` module.

    swfsql committed Feb 1, 2024 · b52932c
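    For reference, RMS normalization scales each element by the reciprocal root-mean-square of the vector (no mean subtraction, no bias), then applies a learned gain. A standalone sketch of the math, not the `LayerRMSNorm1D` module itself:

    ```rust
    /// RMS normalization over a 1-D slice:
    /// y[i] = x[i] / sqrt(mean(x^2) + eps) * gamma[i]
    fn rms_norm(x: &[f32], gamma: &[f32], eps: f32) -> Vec<f32> {
        let mean_sq = x.iter().map(|v| v * v).sum::<f32>() / x.len() as f32;
        let inv_rms = 1.0 / (mean_sq + eps).sqrt();
        x.iter()
            .zip(gamma)
            .map(|(v, g)| v * inv_rms * g)
            .collect()
    }

    fn main() {
        let x = [1.0f32, 2.0, 3.0, 4.0];
        let gamma = [1.0f32; 4];
        println!("{:?}", rms_norm(&x, &gamma, 1e-5));
    }
    ```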
  5. Add split_tensor_along method

    - Add `TrySplitShapeAlong` and `TrySplitTensorAlong`.
    - Minor linting and docs fix.

    TODO
    - Check whether the tape should be returned. If not, it can be removed from the interface.
    - Add a cuda kernel.
    - Consider a different interface that can split into more than two tensors, possibly returned in a vec.
      That would bring it closer to the pytorch interface (chunks).

    swfsql committed Feb 1, 2024 · 693b699
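    An illustrative sketch of the splitting idea on a plain row-major buffer; this is not the `TrySplitTensorAlong` signature, which works on shapes, devices, and tapes:

    ```rust
    /// Split a row-major [rows, cols] buffer along axis 0 into two parts of
    /// `head` and `rows - head` rows, failing if the sizes don't match.
    fn split_along_axis0(
        data: &[f32],
        rows: usize,
        cols: usize,
        head: usize,
    ) -> Result<(Vec<f32>, Vec<f32>), String> {
        if data.len() != rows * cols {
            return Err("buffer length does not match the [rows, cols] shape".into());
        }
        if head > rows {
            return Err("split size exceeds the axis length".into());
        }
        let mid = head * cols;
        Ok((data[..mid].to_vec(), data[mid..].to_vec()))
    }

    fn main() {
        // A [3, 2] tensor split into a [1, 2] head and a [2, 2] tail along axis 0.
        let data = [1.0f32, 2.0, 3.0, 4.0, 5.0, 6.0];
        let (head, tail) = split_along_axis0(&data, 3, 2, 1).unwrap();
        assert_eq!(head, vec![1.0, 2.0]);
        assert_eq!(tail, vec![3.0, 4.0, 5.0, 6.0]);
    }
    ```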
  6. rm unrelated derive

    swfsql committed Feb 1, 2024 · de55567

Commits on Feb 2, 2024

  1. Commit 3122f78
  2. Commit ace3808
  3. Commit f6d06e0

Commits on Feb 6, 2024

  1. Added TryUnstack for tensors.

    - Also added `from_fn` for Arrays.

    Note: the interface currently requires two passes for construction: one to create a list of tensors with NoneTape, and another to put tapes into those tensors.

    swfsql committed Feb 6, 2024 · ea424c3
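    An illustrative sketch of unstacking (the inverse of stacking) on a plain buffer; dfdx's `TryUnstack` works on tensors and tapes, so the names and types here are simplifications:

    ```rust
    /// Unstack a row-major [n, m] buffer into `n` separate length-`m` vectors.
    /// The outputs are produced by a `from_fn`-style closure over the index.
    fn unstack(data: &[f32], n: usize, m: usize) -> Result<Vec<Vec<f32>>, String> {
        if data.len() != n * m {
            return Err("buffer length does not match the [n, m] shape".into());
        }
        Ok((0..n).map(|i| data[i * m..(i + 1) * m].to_vec()).collect())
    }

    fn main() {
        let stacked = [1.0f32, 2.0, 3.0, 4.0, 5.0, 6.0]; // shape [3, 2]
        let rows = unstack(&stacked, 3, 2).unwrap();
        assert_eq!(rows, vec![vec![1.0, 2.0], vec![3.0, 4.0], vec![5.0, 6.0]]);
    }
    ```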
  2. fix wgpu signature

    swfsql committed Feb 6, 2024 · 5994ac5

Commits on Feb 7, 2024

  1. Merge pull request #1 from rainiwu/remove-ftz

    Remove ftz
    swfsql committed Feb 7, 2024 · 24a8593

Commits on Feb 8, 2024

  1. Commit 5ffff2d

Commits on Feb 9, 2024

  1. Added {load/read/save/write}_safetensor_with methods

    These alternative methods:
    - Require load/read to decide whether missing tensors should be skipped;
    - Require load/read/save/write to decide how keys should be mapped.

    swfsql committed Feb 9, 2024 · e883b28
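    A hypothetical illustration of the "_with" idea; the function name, signature, and types below are assumptions for this sketch, not dfdx's actual API. The caller supplies a key-mapping closure and chooses whether missing tensors are skipped or treated as errors.

    ```rust
    use std::collections::HashMap;

    /// Copy tensors from a parsed file into the model's parameters, letting the
    /// caller remap key names and decide how to handle missing entries.
    fn load_safetensors_with<F>(
        file: &HashMap<String, Vec<f32>>, // stand-in for a parsed safetensors file
        params: &mut HashMap<String, Vec<f32>>,
        map_key: F,
        skip_missing: bool,
    ) -> Result<(), String>
    where
        F: Fn(&str) -> String,
    {
        for (name, value) in params.iter_mut() {
            let key = map_key(name.as_str());
            match file.get(&key) {
                Some(data) => *value = data.clone(),
                None if skip_missing => continue,
                None => return Err(format!("missing tensor for key `{key}`")),
            }
        }
        Ok(())
    }

    fn main() {
        let file = HashMap::from([("model.weight".to_string(), vec![1.0, 2.0])]);
        let mut params = HashMap::from([("weight".to_string(), vec![0.0, 0.0])]);
        // Map local field names onto the checkpoint's key layout, skipping extras.
        load_safetensors_with(&file, &mut params, |k| format!("model.{k}"), true).unwrap();
        assert_eq!(params["weight"], vec![1.0, 2.0]);
    }
    ```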
  2. unstack fixes

    swfsql committed Feb 9, 2024 · c695a15
  3. Commit 4141e06
  4. Commit 8202b20
  5. Commit 34234e2

Commits on Feb 20, 2024

  1. silu: fix cpu df

    swfsql committed Feb 20, 2024 · 93202ad
  2. Commit eb70a88
  3. avoid conv1d bound for cudnn

    swfsql committed Feb 20, 2024 · fde7a40
  4. bump gemm

    swfsql committed Feb 20, 2024 · 75d63cd
  5. clippy fix

    swfsql committed Feb 20, 2024 · f0bcb9a
  6. Add mamba-minimal

    - Add stateless forward impl.
      - Efficient for training (although training itself is not yet implemented).
      - Takes the entire input sequence and requires no state cache.
      - Generates one output for each position of the input sequence.
    - Add stateful forward impl.
      - Efficient for inference.
      - Takes only the latest sequence element plus the previous state cache.
      - Generates a single output for that latest input.

    swfsql committed Feb 20, 2024 · cac2f33
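    To make the two forward modes concrete, here is a minimal, self-contained sketch of a toy diagonal state-space recurrence, h[t] = a ⊙ h[t-1] + b·x[t], y[t] = ⟨c, h[t]⟩. It only illustrates the stateless-vs-stateful control flow described above; the actual Mamba block uses input-dependent, discretized SSM parameters plus projections, a conv1d, and SiLU gating, none of which appear here.

    ```rust
    /// Toy diagonal state-space model, standing in for the selective SSM.
    struct TinySsm {
        a: Vec<f32>, // per-channel decay (stand-in for the discretized A)
        b: Vec<f32>, // input scaling     (stand-in for the discretized B)
        c: Vec<f32>, // output projection C
    }

    impl TinySsm {
        /// Stateless forward: consumes the whole sequence, needs no cache,
        /// and produces one output per input position (training-friendly).
        fn forward_stateless(&self, xs: &[f32]) -> Vec<f32> {
            let mut h = vec![0.0f32; self.a.len()];
            xs.iter().map(|&x| self.step(&mut h, x)).collect()
        }

        /// Stateful forward: consumes only the latest input plus the cached
        /// state, and produces a single output (inference-friendly).
        fn forward_stateful(&self, h: &mut Vec<f32>, x: f32) -> f32 {
            self.step(h, x)
        }

        /// One recurrence step: h = a ⊙ h + b * x, y = <c, h>.
        fn step(&self, h: &mut [f32], x: f32) -> f32 {
            for i in 0..h.len() {
                h[i] = self.a[i] * h[i] + self.b[i] * x;
            }
            h.iter().zip(&self.c).map(|(hi, ci)| hi * ci).sum()
        }
    }

    fn main() {
        let ssm = TinySsm { a: vec![0.9, 0.5], b: vec![1.0, 1.0], c: vec![1.0, -1.0] };
        let xs = [1.0f32, 2.0, 3.0];

        // Both modes agree on the outputs; the stateful path just streams.
        let all = ssm.forward_stateless(&xs);
        let mut cache = vec![0.0f32; 2];
        let streamed: Vec<f32> = xs.iter().map(|&x| ssm.forward_stateful(&mut cache, x)).collect();
        assert_eq!(all, streamed);
        println!("{all:?}");
    }
    ```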
  7. Commit bff1b65