|
48 | 48 | "source": [ |
49 | 49 | "<table class=\"tfo-notebook-buttons\" align=\"left\">\n", |
50 | 50 | " <td>\n", |
51 | | - " <a target=\"_blank\" href=\"https://ai.google.dev/examples/vectordb_with_chroma\"><img src=\"https://developers.generativeai.google/static/site-assets/images/docs/notebook-site-button.png\" height=\"32\" width=\"32\" />View on Generative AI</a>\n", |
| 51 | + " <a target=\"_blank\" href=\"https://ai.google.dev/examples/vectordb_with_chroma\"><img src=\"https://ai.google.dev/static/site-assets/images/docs/notebook-site-button.png\" height=\"32\" width=\"32\" />View on Generative AI</a>\n", |
52 | 52 | " </td>\n", |
53 | 53 | " <td>\n", |
54 | 54 | " <a target=\"_blank\" href=\"https://colab.research.google.com/github/google/generative-ai-docs/blob/main/site/en/examples/vectordb_with_chroma.ipynb\"><img src=\"https://www.tensorflow.org/images/colab_logo_32px.png\" />Run in Google Colab</a>\n", |
|
79 | 79 | "\n", |
80 | 80 | "You can run this quickstart in Google Colab.\n", |
81 | 81 | "\n", |
82 | | - "To complete this quickstart on your own development environment, ensure that your envirmonement meets the following requirements:\n", |
| 82 | + "To complete this quickstart on your own development environment, ensure that your environment meets the following requirements:\n", |
83 | 83 | "\n", |
84 | 84 | "- Python 3.9+\n", |
85 | | - "- An installation of `jupyter` to run the notebook.\n", |
86 | | - "\n", |
87 | | - "## Setup\n", |
88 | | - "\n", |
89 | | - "First, download and install the Gemini API Python library." |
| 85 | + "- An installation of `jupyter` to run the notebook." |
90 | 86 | ] |
91 | 87 | }, |
92 | 88 | { |
|
95 | 91 | "id": "akuOzK4dJl3j" |
96 | 92 | }, |
97 | 93 | "source": [ |
98 | | - "## Setup\n" |
99 | | - ] |
100 | | - }, |
101 | | - { |
102 | | - "cell_type": "markdown", |
103 | | - "metadata": { |
104 | | - "id": "L47er-HZN5NI" |
105 | | - }, |
106 | | - "source": [ |
| 94 | + "## Setup\n", |
| 95 | + "\n", |
107 | 96 | "First, download and install ChromaDB and the Gemini API Python library." |
108 | 97 | ] |
109 | 98 | }, |
|
129 | 118 | "!pip install -q chromadb" |
130 | 119 | ] |
131 | 120 | }, |
| 121 | + { |
| 122 | + "cell_type": "markdown", |
| 123 | + "metadata": { |
| 124 | + "id": "jwmKt115PxK8" |
| 125 | + }, |
| 126 | + "source": [ |
| 127 | + "Then import the modules you'll use in this tutorial." |
| 128 | + ] |
| 129 | + }, |
132 | 130 | { |
133 | 131 | "cell_type": "code", |
134 | 132 | "execution_count": null, |
|
160 | 158 | "source": [ |
161 | 159 | "### Grab an API Key\n", |
162 | 160 | "\n", |
163 | | - "Before you can use the Gemini API, you must first obtain an API key. If you don't already have one, create a key with one click in MakerSuite.\n", |
| 161 | + "Before you can use the Gemini API, you must first obtain an API key. If you don't already have one, create a key with one click in Google AI Studio.\n", |
164 | 162 | "\n", |
165 | 163 | "<a class=\"button button-primary\" href=\"https://makersuite.google.com/app/apikey\" target=\"_blank\" rel=\"noopener noreferrer\">Get an API key</a>\n", |
166 | 164 | "\n", |
|
335 | 333 | "outputs": [], |
336 | 334 | "source": [ |
337 | 335 | "# Set up the DB\n", |
338 | | - "import json\n", |
339 | 336 | "db = create_chroma_db(documents, \"googlecarsdatabase\")" |
340 | 337 | ] |
341 | 338 | }, |
|
670 | 667 | }, |
671 | 668 | "outputs": [], |
672 | 669 | "source": [ |
673 | | - "# TODO: Update the query to accept an embed_fn that allows us to specify the query task_type. Running into Chroma errors.\n", |
674 | 670 | "def get_relevant_passage(query, db):\n", |
675 | 671 | " passage = db.query(query_texts=[query], n_results=1)['documents'][0][0]\n", |
676 | 672 | " return passage" |
|
709 | 705 | "id": "s8PNRMpOQkm5" |
710 | 706 | }, |
711 | 707 | "source": [ |
712 | | - "Now that you have found the relevant passage in your set of documents, you can use it make a prompt to pass into the PaLM API." |
| 708 | + "Now that you have found the relevant passage in your set of documents, you can use it to make a prompt to pass into the Gemini API." |
713 | 709 | ] |
714 | 710 | }, |
715 | 711 | { |
|
739 | 735 | { |
740 | 736 | "cell_type": "markdown", |
741 | 737 | "metadata": { |
742 | | - "id": "hnHUJbE9RgwK" |
743 | | - }, |
744 | | - "source": [ |
745 | | - "The `answer` function will generate a response based on the query you have passed in. It retrieves the relevant document, and from there calls the PaLM text generation API to generate a response to the query." |
746 | | - ] |
747 | | - }, |
748 | | - { |
749 | | - "cell_type": "code", |
750 | | - "execution_count": null, |
751 | | - "metadata": { |
752 | | - "id": "NWe34VIcsf7J" |
| 738 | + "id": "hMEjbz4EswQ6" |
753 | 739 | }, |
754 | | - "outputs": [ |
755 | | - { |
756 | | - "name": "stdout", |
757 | | - "output_type": "stream", |
758 | | - "text": [ |
759 | | - "models/gemini-pro\n", |
760 | | - "models/gemini-pro-vision\n", |
761 | | - "models/gemini-ultra\n" |
762 | | - ] |
763 | | - } |
764 | | - ], |
765 | 740 | "source": [ |
766 | | - "for m in genai.list_models():\n", |
767 | | - " if 'generateContent' in m.supported_generation_methods:\n", |
768 | | - " print(m.name)" |
| 741 | + "Pass a query to `make_prompt` to build the prompt:" |
769 | 742 | ] |
770 | 743 | }, |
771 | 744 | { |
|
795 | 768 | } |
796 | 769 | ], |
797 | 770 | "source": [ |
798 | | - "query = \"How do you shift gears in the Google car?\"\n", |
| 771 | + "query = \"How do you use the touchscreen in the Google car?\"\n", |
799 | 772 | "prompt = make_prompt(query, passage)\n", |
800 | 773 | "Markdown(prompt)" |
801 | 774 | ] |
802 | 775 | }, |
803 | | - { |
804 | | - "cell_type": "code", |
805 | | - "execution_count": null, |
806 | | - "metadata": { |
807 | | - "id": "EwfyxFM6Giy9" |
808 | | - }, |
809 | | - "outputs": [], |
810 | | - "source": [ |
811 | | - "model = genai.GenerativeModel('gemini-pro')\n", |
812 | | - "answer = model.generate_content(prompt)" |
813 | | - ] |
814 | | - }, |
815 | 776 | { |
816 | 777 | "cell_type": "markdown", |
817 | 778 | "metadata": { |
818 | | - "id": "hiDpAV5ScQ42" |
| 779 | + "id": "VRy6yXzcPxLB" |
819 | 780 | }, |
820 | 781 | "source": [ |
821 | | - "The temperature controls the randomness of the output. The larger the value, the more random the generated text will be." |
| 782 | + "Now use the `generate_content` method to generate a response from the model." |
822 | 783 | ] |
823 | 784 | }, |
824 | 785 | { |
825 | 786 | "cell_type": "code", |
826 | 787 | "execution_count": null, |
827 | 788 | "metadata": { |
828 | | - "id": "IvgK5xq6HPRx" |
| 789 | + "id": "EwfyxFM6Giy9" |
829 | 790 | }, |
830 | | - "outputs": [ |
831 | | - { |
832 | | - "data": { |
833 | | - "text/markdown": [ |
834 | | - "The provided passage does not provide information about shifting gears in the Google car, so I am unable to answer your question based on this text." |
835 | | - ], |
836 | | - "text/plain": [ |
837 | | - "<IPython.core.display.Markdown object>" |
838 | | - ] |
839 | | - }, |
840 | | - "execution_count": 119, |
841 | | - "metadata": {}, |
842 | | - "output_type": "execute_result" |
843 | | - } |
844 | | - ], |
| 791 | + "outputs": [], |
845 | 792 | "source": [ |
| 793 | + "model = genai.GenerativeModel('gemini-pro')\n", |
| 794 | + "answer = model.generate_content(prompt)\n", |
846 | 795 | "Markdown(answer.text)" |
847 | 796 | ] |
848 | 797 | }, |
|