**Projects/3_Adversarial Search/README.md**
```
$ source activate aind
```

- Download a copy of the project files from GitHub and navigate to the project folder. (Note: if you've previously downloaded the repository for another project, you can skip the clone command; however, you should run `git pull` to receive any project updates since you cloned the repository.)

```
(aind) $ cd "artificial-intelligence/Projects/3_Adversarial Search"
```
## Instructions
You must implement an agent in the `CustomPlayer` class defined in the `my_custom_player.py` file. The interface definition for game agents only requires you to implement the `.get_action()` method, but you can add any other methods to the class that you deem necessary. You can build a basic agent by combining minimax search with alpha-beta pruning and iterative deepening from lecture.
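
A minimal sketch of how those pieces can fit together, assuming the `Isolation` state interface used in this project (`state.actions()`, `state.result(action)`, `state.terminal_test()`, and `state.utility(player_id)`); the names `alpha_beta` and `score` are illustrative helpers, not part of the starter code:

```
def alpha_beta(state, depth, alpha, beta, player_id, maximizing=True):
    # Depth-limited minimax with alpha-beta pruning (illustrative sketch).
    if state.terminal_test():
        return state.utility(player_id)
    if depth <= 0:
        return score(state, player_id)  # any heuristic evaluation function
    if maximizing:
        value = float("-inf")
        for action in state.actions():
            value = max(value, alpha_beta(state.result(action), depth - 1,
                                          alpha, beta, player_id, False))
            alpha = max(alpha, value)
            if alpha >= beta:
                break  # prune: the minimizer will never allow this branch
        return value
    else:
        value = float("inf")
        for action in state.actions():
            value = min(value, alpha_beta(state.result(action), depth - 1,
                                          alpha, beta, player_id, True))
            beta = min(beta, value)
            if beta <= alpha:
                break  # prune: the maximizer will never allow this branch
        return value
```

Iterative deepening then calls this search with increasing depth limits, as sketched in the next section.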
**NOTE:** Your agent will **not** be evaluated in an environment suitable for running machine learning or deep learning agents (like AlphaGo); visit an office hours session **after** completing the project if you would like guidance on incorporating machine learning in your agent.
#### The get_action() Method
This function is called once per turn for each player. The calling function handles the time limit, so `get_action()` only needs to place at least one action on the agent's queue before time expires:

```
def get_action(self, state):
    import random
    self.queue.put(random.choice(state.actions()))
```
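
One way to satisfy this interface is to run iterative deepening on top of a depth-limited search such as the `alpha_beta` sketch above. This is illustrative only; it assumes the caller really does cut the search off at the time limit, and that the agent exposes a `player_id` attribute as in the starter code:

```
def get_action(self, state):
    import random
    # Failsafe: always have *some* legal action queued in case the
    # first search does not finish within the time limit.
    self.queue.put(random.choice(state.actions()))
    depth = 1
    while True:
        # Each completed search overwrites the queued action; the
        # caller takes whatever was queued last when time expires.
        best = max(state.actions(),
                   key=lambda a: alpha_beta(state.result(a), depth - 1,
                                            float("-inf"), float("inf"),
                                            self.player_id, maximizing=False))
        self.queue.put(best)
        depth += 1
```

Queueing the failsafe move first guarantees the agent always returns a legal answer; each deeper pass simply replaces it with a better one.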
Select at least one of the following to implement and evaluate in your report.

- Create a performance baseline using `run_search.py` (with the `fair_matches` flag enabled) to evaluate the effectiveness of your agent using the #my_moves - #opponent_moves heuristic from lecture (a sketch of this heuristic appears after this list)
- Use the same process to evaluate the effectiveness of your agent using your own custom heuristic
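
A sketch of that baseline heuristic, assuming the project's `Isolation` state API (`state.locs` holding each player's board location and `state.liberties(loc)` returning the open cells around it); check the exact names against your starter code:

```
def score(state, player_id):
    # #my_moves - #opponent_moves: prefer positions where our player
    # has more remaining moves (liberties) than the opponent.
    own_loc = state.locs[player_id]
    opp_loc = state.locs[1 - player_id]
    return len(state.liberties(own_loc)) - len(state.liberties(opp_loc))
```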
**Hints:**
- Research other games (chess, go, connect4, etc.) to get ideas for developing good heuristics
- If the results of your tests are very close, try increasing the number of matches (e.g., >100) to increase your confidence in the results
**Adding a basic opening book**
- You will need to write your own code to develop a good opening book, but you can pass data into your agent by saving the file as "data.pickle" in the same folder as `my_custom_player.py`. Use the [pickle](https://docs.python.org/3/library/pickle.html) module to serialize the object you want to save. The pickled object will be accessible to your agent through the `self.data` attribute.
For example, the contents of dictionary `my_data` can be saved to disk:
```
import pickle

with open("data.pickle", 'wb') as f:
    pickle.dump(my_data, f)
```
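
At play time the pickled object is accessible through `self.data`; a minimal sketch of consulting it as an opening book, assuming you chose to key `my_data` by board state:

```
def get_action(self, state):
    # Play from the book when the position is known; otherwise
    # fall back to regular search (e.g., iterative deepening).
    if self.data and state in self.data:
        self.queue.put(self.data[state])
    else:
        ...  # run your search from here
```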
- Create a performance baseline using `run_search.py` to evaluate the effectiveness of a baseline agent (e.g., an agent using your minimax or alpha-beta search code from the classroom); an example invocation appears after this list
- Use `run_search.py` to evaluate the effectiveness of your agent using your own custom search techniques
- You must decide whether to test with or without "fair" matches enabled--justify your choice in your report
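
The exact command-line flags are defined in `run_search.py` itself (run `python run_search.py --help` to list them); assuming flags along the lines of `-r` for the number of rounds, `-o` for the opponent, and `-f` for fair matches, a baseline run might look like:

```
(aind) $ python run_search.py -f -r 100 -o MINIMAX
```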
**Hints:**
- If the results are very close, try increasing the number of matches (e.g., >100) to increase your confidence in the results
- Experiment with adding more search time--does adding time confer any advantage to your agent?