
Commit a439ebf

Eliminate linux/window tabs for bin/solr post. (#2579)
Also use the same style of formatting for the commands.
1 parent b87242e commit a439ebf
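The change is mechanical across the touched pages: a tabbed Linux/Windows pair collapses into a single console block, and each command line gains a `$` prompt. A minimal before/after sketch in AsciiDoc, reconstructed from the tutorial-films hunks in this commit (not an authoritative excerpt of either file revision):

```asciidoc
// Before: one tab per OS, duplicating the command.
[tabs#index-json]
======
Linux/Mac::
+
====
[,console]
----
$ bin/solr post -c films example/films/films.json
----
====

Windows::
+
====
[,console]
----
$ bin/solr post -c films example\films\films.json
----
====
======

// After: a single console block; the forward-slash path serves both platforms.
[,console]
----
$ bin/solr post -c films example/films/films.json
----
```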

File tree: 9 files changed, +51 −132 lines changed


solr/solr-ref-guide/modules/deployment-guide/pages/enabling-ssl.adoc (3 additions, 3 deletions)

@@ -435,7 +435,7 @@ You should get a response that looks like this:
 
 Use `bin/solr post` to index some example documents to the SolrCloud collection created above:
 
-[source,bash]
+[source,console]
 ----
 $ bin/solr post --solr-update-url https://localhost:8984/solr/mycollection/update example/exampledocs/*.xml
 ----
@@ -445,9 +445,9 @@ $ bin/solr post --solr-update-url https://localhost:8984/solr/mycollection/updat
 Use curl to query the SolrCloud collection created above, from a directory containing the PEM formatted certificate and key created above (e.g., `example/etc/`).
 If you have not enabled client authentication (system property `-Djetty.ssl.clientAuth=true`), then you can remove the `-E solr-ssl.pem:secret` option:
 
-[source,bash]
+[source,console]
 ----
-curl -E solr-ssl.pem:secret --cacert solr-ssl.pem "https://localhost:8984/solr/mycollection/select?q=*:*"
+$ curl -E solr-ssl.pem:secret --cacert solr-ssl.pem "https://localhost:8984/solr/mycollection/select?q=*:*"
 ----
 
 === Index a Document using CloudSolrClient

solr/solr-ref-guide/modules/getting-started/pages/tutorial-films.adoc (0 additions, 43 deletions)

@@ -222,72 +222,29 @@ Pick one of the formats and index it into the "films" collection (in each exampl
 .To Index JSON Format
 [tabs#index-json]
 ======
-Linux/Mac::
-+
-====
-[,console]
 ----
 $ bin/solr post -c films example/films/films.json
-
-----
-====
-
-Windows::
-+
-====
-[,console]
-----
-$ bin/solr post -c films example\films\films.json
 ----
-====
 ======
 
 
 .To Index XML Format
 [tabs#index-xml]
 ======
-Linux/Mac::
-+
-====
 [,console]
 ----
 $ bin/solr post -c films example/films/films.xml
-
-----
-====
-
-Windows::
-+
-====
-[,console]
 ----
-$ bin/solr post -c films example\films\films.xml
-----
-====
 ======
 
 
 .To Index CSV Format
 [tabs#index-csv]
 ======
-Linux/Mac::
-+
-====
 [,console]
 ----
 $ bin/solr post -c films example/films/films.csv -params "f.genre.split=true&f.directed_by.split=true&f.genre.separator=|&f.directed_by.separator=|"
-
-----
-====
-
-Windows::
-+
-====
-[,console]
-----
-$ bin/solr post -c films example\films\films.csv -params "f.genre.split=true&f.directed_by.split=true&f.genre.separator=|&f.directed_by.separator=|"
 ----
-====
 ======
 
 Each command includes these main parameters:
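The `-params` string in the CSV command above does real work: `f.genre.split=true` together with `f.genre.separator=|` asks Solr's CSV handler to break the genre column into multiple values on `|` (and likewise for `directed_by`). As a hedged local illustration (the sample cell value is invented, not taken from the dataset), the per-field split Solr performs is equivalent to:

```shell
# Illustrative only: mimic the per-field split that f.genre.split=true with
# f.genre.separator=| requests from Solr's CSV handler.
field='Comedy|Drama|Romance'   # hypothetical cell from a films CSV row

# Each |-separated token becomes one value of the multivalued field,
# printed here one per line.
printf '%s\n' "$field" | tr '|' '\n'
```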

solr/solr-ref-guide/modules/getting-started/pages/tutorial-paramsets.adoc (0 additions, 16 deletions)

@@ -69,26 +69,10 @@ and for the release date also to be single valued.
 
 Now that we have updated our Schema, we need to index the sample film data, or, if you already have indexed it, then re-index it to take advantage of the new field definitions we added.
 
-[tabs#index-json]
-======
-Linux/Mac::
-+
-====
 [,console]
 ----
 $ bin/solr post -c films example/films/films.json
 ----
-====
-
-Windows::
-+
-====
-[,console]
-----
-$ bin/solr post -c films example\films\films.json
-----
-====
-======
 
 === Let's get Searching!
 

solr/solr-ref-guide/modules/getting-started/pages/tutorial-techproducts.adoc (0 additions, 7 deletions)

@@ -167,18 +167,11 @@ You'll need a command shell to run some of the following examples, rooted in the
 The data we will index is in the `example/exampledocs` directory.
 The documents are in a mix of document formats (JSON, CSV, etc.), and fortunately we can index them all at once:
 
-.Linux/Mac
 [,console]
 ----
 $ bin/solr post -c techproducts example/exampledocs/*
 ----
 
-.Windows
-[,console]
-----
-$ bin/solr post -c techproducts example\exampledocs\*
-----
-
 You should see output similar to the following:
 
 [,console]

solr/solr-ref-guide/modules/getting-started/pages/tutorial-vectors.adoc (0 additions, 17 deletions)

@@ -75,27 +75,10 @@ $ curl http://localhost:8983/solr/films/schema -X POST -H 'Content-type:applicat
 
 We have the vectors embedded in our `films.json` file, so let's index that data, taking advantage of our new schema field we just defined.
 
-[tabs#index-json]
-======
-Linux/Mac::
-+
-====
 [,console]
 ----
 $ bin/solr post -c films example/films/films.json
-
 ----
-====
-
-Windows::
-+
-====
-[,console]
-----
-$ bin/solr post -c films example\films\films.json
-----
-====
-======
 
 === Let's do some Vector searches
 Before making the queries, we define an example target vector, simulating a person that

solr/solr-ref-guide/modules/indexing-guide/pages/indexing-with-tika.adoc (12 additions, 12 deletions)

@@ -126,9 +126,9 @@ Note this includes the path, so if you upload a different file, always be sure t
 
 You can also use `bin/solr post` to do the same thing:
 
-[source,bash]
+[,console]
 ----
-bin/solr post -c gettingstarted example/exampledocs/solr-word.pdf -params "literal.id=doc1"
+$ bin/solr post -c gettingstarted example/exampledocs/solr-word.pdf -params "literal.id=doc1"
 ----
 
 Now you can execute a query and find that document with a request like `\http://localhost:8983/solr/gettingstarted/select?q=pdf`.
@@ -146,9 +146,9 @@ The dynamic field `ignored_*` is good for this purpose.
 For the fields you do want to map, explicitly set them using `fmap.IN=OUT` and/or ensure the field is defined in the schema.
 Here's an example:
 
-[source,bash]
+[,console]
 ----
-bin/solr post -c gettingstarted example/exampledocs/solr-word.pdf -params "literal.id=doc1&uprefix=ignored_&fmap.last_modified=last_modified_dt"
+$ bin/solr post -c gettingstarted example/exampledocs/solr-word.pdf -params "literal.id=doc1&uprefix=ignored_&fmap.last_modified=last_modified_dt"
 ----
 
 [NOTE]
@@ -561,18 +561,18 @@ If `literalsOverride=false`, literals will be appended as multi-value to the Tik
 
 The command below captures `<div>` tags separately (`capture=div`), and then maps all the instances of that field to a dynamic field named `foo_t` (`fmap.div=foo_t`).
 
-[source,bash]
+[,console]
 ----
-bin/solr post -c gettingstarted example/exampledocs/sample.html -params "literal.id=doc2&captureAttr=true&defaultField=_text_&fmap.div=foo_t&capture=div"
+$ bin/solr post -c gettingstarted example/exampledocs/sample.html -params "literal.id=doc2&captureAttr=true&defaultField=_text_&fmap.div=foo_t&capture=div"
 ----
 
 === Using Literals to Define Custom Metadata
 
 To add in your own metadata, pass in the literal parameter along with the file:
 
-[source,bash]
+[,console]
 ----
-bin/solr post -c gettingstarted -params "literal.id=doc4&captureAttr=true&defaultField=text&capture=div&fmap.div=foo_t&literal.blah_s=Bah" example/exampledocs/sample.html
+$ bin/solr post -c gettingstarted -params "literal.id=doc4&captureAttr=true&defaultField=text&capture=div&fmap.div=foo_t&literal.blah_s=Bah" example/exampledocs/sample.html
 ----
 
 The parameter `literal.blah_s=Bah` will insert a field `blah_s` into every document.
@@ -582,9 +582,9 @@ Every instance of the text will be "Bah".
 
 The example below passes in an XPath expression to restrict the XHTML returned by Tika:
 
-[source,bash]
+[,console]
 ----
-bin/solr post -c gettingstarted -params "literal.id=doc5&captureAttr=true&defaultField=text&capture=div&fmap.div=foo_t&xpath=/xhtml:html/xhtml:body/xhtml:div//node()" example/exampledocs/sample.html
+$ bin/solr post -c gettingstarted -params "literal.id=doc5&captureAttr=true&defaultField=text&capture=div&fmap.div=foo_t&xpath=/xhtml:html/xhtml:body/xhtml:div//node()" example/exampledocs/sample.html
 ----
 
 === Extracting Data without Indexing
@@ -601,9 +601,9 @@ curl "http://localhost:8983/solr/gettingstarted/update/extract?&extractOnly=true
 
 The output includes XML generated by Tika (and further escaped by Solr's XML) using a different output format to make it more readable (`-out yes` instructs the tool to echo Solr's output to the console):
 
-[source,bash]
+[,console]
 ----
-bin/solr post -c gettingstarted -params "extractOnly=true&wt=ruby&indent=true" -out yes example/exampledocs/sample.html
+$ bin/solr post -c gettingstarted -params "extractOnly=true&wt=ruby&indent=true" -out yes example/exampledocs/sample.html
 ----
 
 === Using Solr Cell with a POST Request

solr/solr-ref-guide/modules/indexing-guide/pages/post-tool.adoc (26 additions, 26 deletions)

@@ -110,48 +110,48 @@ This section presents several examples.
 
 Index all JSON files into `gettingstarted`.
 
-[source,bash]
+[,console]
 ----
-bin/solr post -url http://localhost:8983/solr/gettingstarted/update *.json
+$ bin/solr post -url http://localhost:8983/solr/gettingstarted/update *.json
 ----
 
 === Indexing XML
 
 Add all documents with file extension `.xml` to the collection named `gettingstarted`.
 
-[source,bash]
+[,console]
 ----
-bin/solr post -url http://localhost:8983/solr/gettingstarted/update *.xml
+$ bin/solr post -url http://localhost:8983/solr/gettingstarted/update *.xml
 ----
 
 Add all documents starting with `article` with file extension `.xml` to the `gettingstarted` collection on Solr running on port `8984`.
 
-[source,bash]
+[,console]
 ----
-bin/solr post -url http://localhost:8984/solr/gettingstarted/update article*.xml
+$ bin/solr post -url http://localhost:8984/solr/gettingstarted/update article*.xml
 ----
 
 Send XML arguments to delete a document from `gettingstarted`.
 
-[source,bash]
+[,console]
 ----
-bin/solr post -url http://localhost:8983/solr/gettingstarted/update -mode args -type application/xml '<delete><id>42</id></delete>'
+$ bin/solr post -url http://localhost:8983/solr/gettingstarted/update -mode args -type application/xml '<delete><id>42</id></delete>'
 ----
 
 === Indexing CSV and JSON
 
 Index all CSV and JSON files into `gettingstarted` from current directory:
 
-[source,bash]
+[,console]
 ----
-bin/solr post -c gettingstarted -filetypes json,csv .
+$ bin/solr post -c gettingstarted -filetypes json,csv .
 ----
 
 Index a tab-separated file into `gettingstarted`:
 
-[source,bash]
+[,console]
 ----
-bin/solr post -url http://localhost:8984/solr/signals/update -params "separator=%09" -type text/csv data.tsv
+$ bin/solr post -url http://localhost:8984/solr/signals/update -params "separator=%09" -type text/csv data.tsv
 ----
 
 The content type (`-type`) parameter is required to treat the file as the proper type, otherwise it will be ignored and a WARNING logged as it does not know what type of content a .tsv file is.
@@ -161,32 +161,32 @@ The xref:indexing-with-update-handlers.adoc#csv-formatted-index-updates[CSV hand
 
 Index a PDF file into `gettingstarted`.
 
-[source,bash]
+[,console]
 ----
-bin/solr post -url http://localhost:8983/solr/gettingstarted/update a.pdf
+$ bin/solr post -url http://localhost:8983/solr/gettingstarted/update a.pdf
 ----
 
 Automatically detect content types in a folder, and recursively scan it for documents for indexing into `gettingstarted`.
 
-[source,bash]
+[,console]
 ----
-bin/solr post -url http://localhost:8983/solr/gettingstarted/update afolder/
+$ bin/solr post -url http://localhost:8983/solr/gettingstarted/update afolder/
 ----
 
 Automatically detect content types in a folder, but limit it to PPT and HTML files and index into `gettingstarted`.
 
-[source,bash]
+[,console]
 ----
-bin/solr post -url http://localhost:8983/solr/gettingstarted/update -filetypes ppt,html afolder/
+$ bin/solr post -url http://localhost:8983/solr/gettingstarted/update -filetypes ppt,html afolder/
 ----
 
 === Indexing to a Password Protected Solr (Basic Auth)
 
 Index a PDF as the user "solr" with password "SolrRocks":
 
-[source,bash]
+[,console]
 ----
-bin/solr post -u solr:SolrRocks -url http://localhost:8983/solr/gettingstarted/update a.pdf
+$ bin/solr post -u solr:SolrRocks -url http://localhost:8983/solr/gettingstarted/update a.pdf
 ----
 
 === Crawling a Website to Index Documents
@@ -195,26 +195,26 @@ Crawl the Apache Solr website going one layer deep and indexing the pages into S
 
 See xref:indexing-with-tika.adoc#trying-out-solr-cell[Trying Out Solr Cell] to learn more about setting up Solr for extracting content from web pages.
 
-[source,bash]
+[,console]
 ----
-bin/solr post -mode web -c gettingstarted -recursive 1 -delay 1 https://solr.apache.org/
+$ bin/solr post -mode web -c gettingstarted -recursive 1 -delay 1 https://solr.apache.org/
 ----
 
 === Standard Input as Source for Indexing
 
 You can use the standard input as your source for data to index.
 Notice the `-out` providing raw responses from Solr.
 
-[source,bash]
+[,console]
 ----
-echo '{commit: {}}' | bin/solr post -mode stdin -url http://localhost:8983/my_collection/update -out
+$ echo '{commit: {}}' | bin/solr post -mode stdin -url http://localhost:8983/my_collection/update -out
 ----
 
 === Raw Data as Source for Indexing
 
 Provide the raw document as a string for indexing.
 
-[source,bash]
+[,console]
 ----
-bin/solr post -url http://localhost:8983/signals/update -mode args -type text/csv -out $'id,value\n1,0.47'
+$ bin/solr post -url http://localhost:8983/signals/update -mode args -type text/csv -out $'id,value\n1,0.47'
 ----
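The tab-separated example above leans on `separator=%09`, the percent-encoding of the TAB character, which the CSV handler decodes before splitting columns. A small sketch (POSIX shell assumed; the file path is illustrative) that produces a file of the shape `data.tsv` is assumed to have and confirms the encoding:

```shell
# Create a tiny tab-separated file (header row plus one record);
# the field separator is a literal TAB, matching what data.tsv is
# assumed to contain.
printf 'id\tvalue\n1\t0.47\n' > /tmp/data.tsv

# separator=%09 works because %09 is the percent-encoded TAB (ASCII 0x09):
tab=$(printf '\t')
printf '%%%02X\n' "'$tab"   # prints %09
```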

solr/solr-ref-guide/modules/query-guide/pages/spatial-search.adoc (4 additions, 2 deletions)

@@ -70,8 +70,10 @@ However, it's much bulkier than the raw coordinates for such simple data.
 
 Using the `bin/solr post` tool:
 
-[source,text]
-bin/solr post -type "application/json" -url "http://localhost:8983/solr/mycollection/update?format=geojson" /path/to/geojson.file
+[,console]
+----
+$ bin/solr post -type "application/json" -url "http://localhost:8983/solr/mycollection/update?format=geojson" /path/to/geojson.file
+----
 
 The key parameter to pass in with your request is:
 
