
GLM-4.7

API reference for GLM_4_7

from modal_training_gym.common.models.glm_4_7 import GLM_4_7

GLM-4.7 large MoE model from Zhipu AI.

Inherits from: HFModelConfiguration, ModelConfiguration

| Field | Type | Default | Description |
| --- | --- | --- | --- |
| `model_name` | `str` | `"zai-org/GLM-4.7"` | HuggingFace repo ID or other model identifier. |
| `model_path` | `str \| None` | `None` | |
| `architecture` | `ModelArchitecture \| None` | `None` | |
| `training` | `ModelTrainingConfig \| None` | `ModelTrainingConfig(gpu_type='H100', n_nodes=4, tensor_model_parallel_size=2, pipeline_model_parallel_size=4, context_parallel_size=1, sequence_parallel=True, expert_model_parallel_size=4, moe_permute_fusion=True, moe_grouped_gemm=True, moe_shared_expert_overlap=True, moe_aux_loss_coeff=0.001, lora_rank=128, lora_alpha=32, target_modules='all-linear', merge_lora=False)` | |
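The default `training` value packs many parallelism and LoRA settings into one object. As a rough illustration of what those defaults look like in code, the sketch below mirrors them in a plain dataclass. This is an assumption-laden stand-in, not the real `ModelTrainingConfig` class from `modal_training_gym`, whose actual definition may include validation and additional fields.

```python
from dataclasses import dataclass


# Illustrative sketch only: a plain dataclass mirroring the default
# ModelTrainingConfig values from the field table above. The real
# ModelTrainingConfig in modal_training_gym may differ in shape.
@dataclass
class TrainingDefaultsSketch:
    gpu_type: str = "H100"
    n_nodes: int = 4
    tensor_model_parallel_size: int = 2
    pipeline_model_parallel_size: int = 4
    context_parallel_size: int = 1
    sequence_parallel: bool = True
    expert_model_parallel_size: int = 4
    moe_permute_fusion: bool = True
    moe_grouped_gemm: bool = True
    moe_shared_expert_overlap: bool = True
    moe_aux_loss_coeff: float = 0.001
    lora_rank: int = 128
    lora_alpha: int = 32
    target_modules: str = "all-linear"
    merge_lora: bool = False


cfg = TrainingDefaultsSketch()
# Product of tensor and pipeline parallel degrees: the minimum number of
# GPUs one model replica spans under these two dimensions alone.
print(cfg.tensor_model_parallel_size * cfg.pipeline_model_parallel_size)  # 8
```

Overriding a field at construction time (e.g. `TrainingDefaultsSketch(lora_rank=64)`) follows standard dataclass semantics; how the real config class handles overrides is not specified here.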

Download or materialize weights into the model volume.

Source: modal_training_gym/common/models/glm_4_7.py