riyajatar37003

Results: 29 issues by riyajatar37003

Hi, where exactly is dropout being applied? Can anyone point to the code/file? Thanks
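For context, a minimal sketch of where dropout typically lives in a Hugging Face BERT-style encoder (the model name below is an assumption; the repo in question may wire it differently):

```python
# A minimal sketch: in BERT-style models from transformers, dropout is set in the
# model config and applied inside the encoder layers, not in the training script.
from transformers import AutoConfig, AutoModel

config = AutoConfig.from_pretrained("bert-base-uncased")
print(config.hidden_dropout_prob)           # dropout on embeddings / attention and FFN outputs
print(config.attention_probs_dropout_prob)  # dropout on the attention probability matrix

model = AutoModel.from_pretrained("bert-base-uncased", config=config)
# List the concrete nn.Dropout modules and where they sit in the module tree.
for name, module in model.named_modules():
    if module.__class__.__name__ == "Dropout":
        print(name)
```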

I am passing these config.pbtxt files. **config-1** `name: "model_onnx" backend: "onnxruntime" max_batch_size: 128 input [ { name: "input_ids" data_type: TYPE_INT64 dims: [-1 ] }, { name: "attention_mask" data_type: TYPE_INT64 dims: [...
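A minimal sketch for double-checking such a config.pbtxt against the exported model (the file path is an assumption): the input/output names, dtypes, and dims in the config must match what the ONNX graph actually exposes.

```python
# A minimal sketch: list the tensors in the ONNX graph so the config.pbtxt
# input/output entries can be matched against them.
import onnx

model = onnx.load("model.onnx")  # path is an assumption
for tensor in model.graph.input:
    dims = [d.dim_param or d.dim_value for d in tensor.type.tensor_type.shape.dim]
    print("input:", tensor.name, tensor.type.tensor_type.elem_type, dims)
for tensor in model.graph.output:
    dims = [d.dim_param or d.dim_value for d in tensor.type.tensor_type.shape.dim]
    print("output:", tensor.name, tensor.type.tensor_type.elem_type, dims)
```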

**Description** I have converted a PyTorch model to ONNX with fp16 precision. **Triton Information** 24.03. Are you using the Triton container or did you build it yourself? Container. **To Reproduce**...
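For reference, a minimal sketch of one common fp16 export path (the model name, input shapes, and output names are assumptions, not the issue author's exact steps): export to fp32 ONNX first, then convert the weights with onnxconverter-common.

```python
# A minimal sketch: export with torch.onnx.export, then convert fp32 -> fp16.
import onnx
import torch
from onnxconverter_common import float16
from transformers import AutoModel

model = AutoModel.from_pretrained("bert-base-uncased")
model.config.return_dict = False  # export plain tuples instead of ModelOutput objects
model.eval()

input_ids = torch.ones(1, 128, dtype=torch.int64)
attention_mask = torch.ones(1, 128, dtype=torch.int64)
torch.onnx.export(
    model,
    (input_ids, attention_mask),
    "model_fp32.onnx",
    input_names=["input_ids", "attention_mask"],
    output_names=["last_hidden_state", "pooler_output"],
    dynamic_axes={"input_ids": {0: "batch", 1: "seq"},
                  "attention_mask": {0: "batch", 1: "seq"}},
)

fp16_model = float16.convert_float_to_float16(onnx.load("model_fp32.onnx"))
onnx.save(fp16_model, "model_fp16.onnx")
```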

`model-analyzer profile --run-config-profile-models-concurrently-enable --override-output-model-repository --model-repository model_repositories --profile-models model1\,model2 --output-model-repository-path ./model1_op --export-path model1_report` After running the above command I am getting the following error: [Model Analyzer] Initializing GPUDevice handles [Model Analyzer] Using GPU...
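As a side note, the same options can usually be collected into a YAML config file and passed with `--config-file`; the key names below are assumptions derived from the CLI flags above (dashes replaced by underscores), so verify them against the Model Analyzer docs for the installed version.

```yaml
# Hypothetical config.yaml for `model-analyzer profile --config-file config.yaml`
# (key names are assumptions, not confirmed against this Model Analyzer version).
model_repository: model_repositories
profile_models:
  - model1
  - model2
run_config_profile_models_concurrently_enable: true
override_output_model_repository: true
output_model_repository_path: ./model1_op
export_path: model1_report
```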

investigating

I am using a transformer model to generate embeddings inside a function, and that function is applied to each row of a dataframe using parallel_apply, which is throwing the error below: RuntimeError: Cannot re-initialize...
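A minimal sketch of one common workaround (the SentenceTransformer model and column names are assumptions): CUDA cannot be re-initialized inside a forked worker, so instead of calling the GPU model per row inside parallel_apply, encode the whole column in one batched call on the main process.

```python
# A minimal sketch: batch-encode on the parent process to avoid forked CUDA contexts.
import pandas as pd
from sentence_transformers import SentenceTransformer

df = pd.DataFrame({"text": ["hello world", "triton inference server"]})
model = SentenceTransformer("all-MiniLM-L6-v2", device="cuda")

# One batched call on the parent process; no GPU work happens in forked workers.
embeddings = model.encode(df["text"].tolist(), batch_size=64, show_progress_bar=False)
df["embedding"] = list(embeddings)
```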

I am trying to understand the loss function: `def calc_loss(self, y_true, y_pred): """Compute the cosine loss within the batch via matrix operations.""" y_true = y_true[::2] norms = (y_pred ** 2).sum(axis=1, keepdims=True) ** 0.5 y_pred =...
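A minimal sketch, not the author's full function, illustrating what the visible lines do: embeddings for a batch of sentence pairs are interleaved (rows 0/1 form pair 0, rows 2/3 form pair 1, ...), so `y_true[::2]` keeps one label per pair, and dividing by the row-wise L2 norm makes dot products equal cosine similarities. The interleaved layout is an assumption from the indexing pattern.

```python
# A minimal sketch of the interleaved-pair cosine-similarity pattern.
import numpy as np

y_pred = np.random.randn(8, 4)                 # 4 pairs -> 8 interleaved embeddings (assumed layout)
y_true = np.repeat(np.array([1, 0, 1, 1]), 2)  # the same label stored for both rows of a pair

labels = y_true[::2]                                    # one label per pair
norms = (y_pred ** 2).sum(axis=1, keepdims=True) ** 0.5
unit = y_pred / norms                                   # row-wise L2 normalization
cos_sim = (unit[::2] * unit[1::2]).sum(axis=1)          # cosine similarity of each pair
print(labels.shape, cos_sim.shape)                      # (4,) (4,)
```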

question

Hi, can you share any example/command for these modes? During launch I am doing it this way: "tritonserver --model-control-mode explicit --exit-on-error=false --model-repository=/tmp/models", and in the other container I am running...
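For context, a minimal sketch of driving explicit model control from another container with the Python client (the server URL and model name are assumptions): with `--model-control-mode explicit`, models are loaded and unloaded on demand through the model repository API.

```python
# A minimal sketch: load/unload a model against a Triton server in explicit mode.
import tritonclient.http as httpclient

client = httpclient.InferenceServerClient(url="localhost:8000")
client.load_model("model1")                 # POST v2/repository/models/model1/load
print(client.is_model_ready("model1"))      # True once loading succeeds
client.unload_model("model1")               # POST v2/repository/models/model1/unload
print(client.get_model_repository_index())  # list models and their current state
```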

Hi, is there any way we can pass our own custom test samples to Model Analyzer during profiling?
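One possible route, sketched below with assumed input names and shapes: Model Analyzer drives perf_analyzer, and perf_analyzer can replay user-provided samples from a JSON file passed via `--input-data`, so custom test data can be supplied that way (verify the exact JSON schema against the perf_analyzer docs).

```python
# A minimal sketch: write a perf_analyzer-style input-data JSON file.
import json

samples = {
    "data": [
        {
            "input_ids": {"content": [101, 7592, 2088, 102], "shape": [4]},
            "attention_mask": {"content": [1, 1, 1, 1], "shape": [4]},
        }
    ]
}
with open("input_data.json", "w") as f:
    json.dump(samples, f, indent=2)
```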

Hi, is there a way we can train a separate retriever and reranker? If yes, how do I create a dataset for both of them? Thanks
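A minimal sketch of a common training-data layout (the query/pos/neg field names follow a convention used by several embedding-training toolkits and are an assumption; the exact schema depends on the trainer): the same triples can usually feed both the retriever (bi-encoder) and the reranker (cross-encoder).

```python
# A minimal sketch: one JSONL line per query with positive and negative passages.
import json

sample = {
    "query": "what is triton inference server",
    "pos": ["Triton Inference Server is NVIDIA's open-source model serving software."],
    "neg": ["Triton is a moon of Neptune."],
}
with open("train.jsonl", "w") as f:
    f.write(json.dumps(sample) + "\n")
```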