# Fact Checker CLI

A command-line tool that identifies false or misleading claims in articles or statements using Perplexity's Sonar API for web research.
## Features

- Analyze claims or entire articles for factual accuracy
- Identify false, misleading, or unverifiable claims
- Provide explanations and corrections for inaccurate information
- Output results in human-readable format or structured JSON
- Cite reliable sources for fact-checking assessments
- Leverages Perplexity's structured outputs for reliable JSON parsing (for Tier 3+ users)
## Installation

1. Install required dependencies:

   ```bash
   pip install requests pydantic
   ```

2. Make the script executable:

   ```bash
   chmod +x fact_checker.py
   ```
## API Key Setup

The tool requires a Perplexity API key to function. You can provide it in one of these ways:

1. As a command-line argument: `--api-key YOUR_API_KEY`
2. As an environment variable: `export PPLX_API_KEY=YOUR_API_KEY`
3. In a file named `pplx_api_key` or `.pplx_api_key` in the same directory as the script:

```bash
# Create a file to store your API key
echo "YOUR_API_KEY" > .pplx_api_key
# Make sure to protect your API key
chmod 600 .pplx_api_key
```

**Note:** If you're using the structured outputs feature, you'll need a Perplexity API account with Tier 3 or higher access level.
## Quick Start

Here's a command you can copy and run immediately after setup:

```bash
# Make sure your API key is set up as described above, then run:
./fact_checker.py -t "The Earth is flat and NASA is hiding the truth."
```

This will analyze the claim, research it using Perplexity's Sonar API, and return a detailed fact check with ratings, explanations, and sources.
## Usage

### Basic Usage

Check a claim:

```bash
./fact_checker.py --text "The Earth is flat and NASA is hiding the truth."
```
57
67
- ### Check an article from a file:
58
+ Check an Article from a File
68
59
69
- ``` bash
70
60
./fact_checker.py --file article.txt
71
- ```
### Specify a Different Model

```bash
./fact_checker.py --text "Global temperatures have decreased over the past century." --model "sonar-pro"
```
### Output Results as JSON

```bash
./fact_checker.py --text "Mars has a breathable atmosphere." --json
```
### Use a Custom Prompt File

```bash
./fact_checker.py --text "The first human heart transplant was performed in the United States." --prompt-file custom_prompt.md
```
### Enable Structured Outputs (for Tier 3+ Users)

Structured output is disabled by default. To enable structured outputs for reliable JSON parsing, pass the `--structured-output` flag:

```bash
./fact_checker.py --text "Vaccines cause autism." --structured-output
```
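Under the hood, structured outputs work by attaching a JSON Schema to the request's `response_format` field, which constrains the model to emit parseable JSON. A minimal sketch of such a request body, assuming the shape described in Perplexity's structured-outputs documentation — the schema itself is illustrative, not the tool's actual one:

```python
import json

# Illustrative JSON Schema for a fact-check verdict (not the script's real schema)
fact_check_schema = {
    "type": "object",
    "properties": {
        "overall_rating": {
            "type": "string",
            "enum": ["MOSTLY_TRUE", "MIXED", "MOSTLY_FALSE"],
        },
        "summary": {"type": "string"},
    },
    "required": ["overall_rating", "summary"],
}

# Request body for POST https://api.perplexity.ai/chat/completions
# (structured outputs require a Tier 3+ account)
payload = {
    "model": "sonar-pro",
    "messages": [
        {"role": "user", "content": "Fact check: Vaccines cause autism."}
    ],
    "response_format": {
        "type": "json_schema",
        "json_schema": {"schema": fact_check_schema},
    },
}

print(json.dumps(payload, indent=2))
```

Without `--structured-output` the tool has to parse JSON out of free-form model text, which is why the flag is recommended for accounts that have access to it.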
### Get Help

```bash
./fact_checker.py --help
```

## Output Format
The tool provides output including:

- Overall rating: MOSTLY_TRUE, MIXED, or MOSTLY_FALSE
- Summary: A brief overview of the fact-checking findings
- Claims analysis: A list of specific claims with individual ratings:
  - TRUE: Factually accurate and supported by evidence
  - FALSE: Contradicted by evidence
  - MISLEADING: Contains some truth but presented in a way that could lead to incorrect conclusions
  - UNVERIFIABLE: Cannot be conclusively verified with available information
- Explanations: Detailed reasoning for each claim
- Sources: Citations and URLs used for verification
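The fields above map naturally onto `pydantic` models (one of the installed dependencies). The sketch below shows one plausible shape for the parsed result — these class and field names are illustrative, not the script's actual classes:

```python
from enum import Enum
from typing import List

from pydantic import BaseModel


class ClaimRating(str, Enum):
    TRUE = "TRUE"
    FALSE = "FALSE"
    MISLEADING = "MISLEADING"
    UNVERIFIABLE = "UNVERIFIABLE"


class Claim(BaseModel):
    text: str
    rating: ClaimRating
    explanation: str
    sources: List[str] = []


class FactCheckResult(BaseModel):
    overall_rating: str  # MOSTLY_TRUE, MIXED, or MOSTLY_FALSE
    summary: str
    claims: List[Claim]


# Build a result the way the tool's JSON mode might report it
result = FactCheckResult(
    overall_rating="MOSTLY_FALSE",
    summary="The claim is contradicted by evidence.",
    claims=[
        Claim(
            text="Mars has a breathable atmosphere.",
            rating=ClaimRating.FALSE,
            explanation="Mars' atmosphere is mostly CO2 and far too thin to breathe.",
            sources=["NASA.gov"],
        )
    ],
)
```

Validating the API's JSON against models like these is what makes the `--json` output reliable to consume programmatically.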
## Example

```
$ ./fact_checker.py -t "The Great Wall of China is visible from the moon."

Fact checking in progress...

...

Claim 1: ❌ FALSE

...

- NASA.gov
- Scientific American
- National Geographic
```
## Limitations

- The accuracy of fact-checking depends on the quality of information available through the Perplexity Sonar API
- Like all language models, the underlying AI may have limitations in certain specialized domains
- The structured outputs feature requires a Tier 3 or higher Perplexity API account
- The tool does not replace professional fact-checking services for highly sensitive or complex content