
Commit 07a3fc0

Removes multiple newlines at the end of files that are breaking the editorconfig step of CI. (#8258)
Parent: 9689673

22 files changed (+0 additions, −24 deletions)

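For context on the fix: the editorconfig step of CI rejects files that end in more than one newline, so each hunk below deletes the extra trailing blank lines and leaves a single final newline. Below is a minimal illustrative sketch, not part of this commit or the repository; it assumes the check only requires exactly one final newline (e.g. an `insert_final_newline = true` style rule), and the script name and usage are hypothetical.

```python
#!/usr/bin/env python3
# Hypothetical helper (not part of llama.cpp): trim runs of blank lines at the
# end of text files so each file ends with exactly one newline, which is what
# an editorconfig-style final-newline check typically expects.
import sys
from pathlib import Path

def trim_trailing_newlines(path: Path) -> bool:
    """Rewrite `path` so it ends with a single '\n'. Returns True if the file changed."""
    data = path.read_bytes()
    if not data:
        return False
    # Drop all trailing CR/LF bytes, then add back exactly one newline.
    trimmed = data.rstrip(b"\r\n") + b"\n"
    if trimmed == data:
        return False
    path.write_bytes(trimmed)
    return True

if __name__ == "__main__":
    # Usage: python trim_trailing_newlines.py file1 [file2 ...]
    changed = [p for p in map(Path, sys.argv[1:]) if trim_trailing_newlines(p)]
    for p in changed:
        print(f"trimmed {p}")
```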

.github/ISSUE_TEMPLATE/config.yml

Lines changed: 0 additions & 2 deletions
@@ -9,5 +9,3 @@ contact_links:
 - name: Want to contribute?
   url: https://github.com/ggerganov/llama.cpp/wiki/contribute
   about: Head to the contribution guide page of the wiki for areas you can help with
-
-

common/common.h

Lines changed: 0 additions & 1 deletion
@@ -459,4 +459,3 @@ void yaml_dump_string_multiline(FILE * stream, const char * prop_name, const cha
 void yaml_dump_non_result_info(
     FILE * stream, const gpt_params & params, const llama_context * lctx,
     const std::string & timestamp, const std::vector<int> & prompt_tokens, const char * model_desc);
-

examples/embedding/README.md

Lines changed: 0 additions & 1 deletion
@@ -58,4 +58,3 @@ The above command will output space-separated float values.
 ```powershell
 embedding.exe -p 'Castle<#sep#>Stronghold<#sep#>Dog<#sep#>Cat' --embd-separator '<#sep#>' --embd-normalize 2 --embd-output-format '' -m './path/to/model.gguf' --n-gpu-layers 99 --log-disable 2>/dev/null
 ```
-

examples/infill/infill.cpp

Lines changed: 0 additions & 1 deletion
@@ -659,4 +659,3 @@ int main(int argc, char ** argv) {
 
     return 0;
 }
-

examples/lookup/README.md

Lines changed: 0 additions & 1 deletion
@@ -10,4 +10,3 @@ More info:
 
 https://github.com/ggerganov/llama.cpp/pull/4484
 https://github.com/ggerganov/llama.cpp/issues/4226
-

examples/main-cmake-pkg/.gitignore

Lines changed: 0 additions & 1 deletion
@@ -48,4 +48,3 @@
 build*/
 out/
 tmp/
-

examples/main-cmake-pkg/CMakeLists.txt

Lines changed: 0 additions & 1 deletion
@@ -30,4 +30,3 @@ target_include_directories(${TARGET} PRIVATE ${_common_path})
 install(TARGETS ${TARGET} RUNTIME)
 target_link_libraries(${TARGET} PRIVATE common llama ${CMAKE_THREAD_LIBS_INIT})
 target_compile_features(${TARGET} PRIVATE cxx_std_11)
-

examples/server-embd.py

Lines changed: 0 additions & 1 deletion
@@ -31,4 +31,3 @@ async def main():
             embedding2 = np.array(result[j])
             similarity = np.dot(embedding1, embedding2) / (np.linalg.norm(embedding1) * np.linalg.norm(embedding2))
             print(f"Similarity between {i} and {j}: {similarity:.2f}")
-

examples/server/tests/features/passkey.feature

Lines changed: 0 additions & 1 deletion
@@ -52,4 +52,3 @@ Feature: Passkey / Self-extend with context shift
 #| TheBloke/Llama-2-7B-GGUF | llama-2-7b.Q2_K.gguf | 4096 | 3 | 16384 | 512 | 4 | 512 | 500 | 300 | 1234 | 5 | 1234 |
 #| TheBloke/Mixtral-8x7B-v0.1-GGUF | mixtral-8x7b-v0.1.Q2_K.gguf | 32768 | 2 | 16384 | 512 | 4 | 512 | 500 | 100 | 0987 | 5 | 0
 # 987 |
-

examples/server/themes/buttons-top/index.html

Lines changed: 0 additions & 1 deletion
@@ -1054,4 +1054,3 @@ <h1>llama.cpp</h1>
 </body>
 
 </html>
-
