correct prompt.json download hyperlink for xpu llm (#1921)
jingxu10 authored Aug 21, 2023
1 parent 6bad342 commit 6e0099a
Showing 1 changed file with 1 addition and 1 deletion.
llm/xpu/index.html: 1 addition & 1 deletion
@@ -183,7 +183,7 @@ <h4>Preparations<a class="headerlink" href="#preparations" title="Permalink to t
 
 <span class="c1"># Get the sample prompt.json</span>
 <span class="c1"># Make sure the downloaded prompt.json file is under the same directory as that of the python scripts mentioned above.</span>
-$<span class="w"> </span>wget<span class="w"> </span>https://intel-extension-for-pytorch.s3.amazonaws.com/miscellaneous/llm/prompt.txt
+$<span class="w"> </span>wget<span class="w"> </span>https://intel-extension-for-pytorch.s3.amazonaws.com/miscellaneous/llm/prompt.json
 </pre></div>
 </div>
 <p>The following environment variables are required to run llama on Intel® Data Center GPU Max Series.</p>
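Stripped of the syntax-highlighting markup, the corrected documentation line renders as a single `wget` command. A minimal, network-free sketch of that step (the URL is taken verbatim from the diff; the download itself is shown as a comment to avoid a network dependency):

```shell
# URL fixed by this commit: the sample prompt file is prompt.json, not prompt.txt.
URL="https://intel-extension-for-pytorch.s3.amazonaws.com/miscellaneous/llm/prompt.json"

# In the published docs this step is simply:
#   wget "$URL"
# Run it from the same directory as the python scripts mentioned above,
# since they expect prompt.json alongside them.

# The file name wget would save is the last path component of the URL:
echo "${URL##*/}"
```

Printing the saved file name makes the point of the fix visible: scripts looking for `prompt.json` would not find a file downloaded as `prompt.txt`.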