WebAISum is a Python script that summarizes web pages using AI models. It supports both local models served through Ollama and remote services such as OpenAI.
- Summarize web pages using AI models 🌐🤖
- Support for local models (Ollama) and remote services (OpenAI) 🏠☁️
- Customizable model selection 🔧
- Debug mode for verbose output 🐛
- Python 3.6 or higher 🐍
- requests library 📦
- langchain_community library 📚
- langchain_openai library 🔑
- Clone the repository:
git clone https://github.com/dkruyt/webaisum.git
- Install the required dependencies:
pip install -r requirements.txt
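Based on the dependencies listed above, requirements.txt presumably contains something like the following; check the file in the repository for the exact (possibly version-pinned) entries:

```
requests
langchain_community
langchain_openai
```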
To summarize a web page, run the `webaisum.py` script with the desired URL:
python webaisum.py https://example.com
- `--model`: Specify the AI model to use (default: llama3 for Ollama or gpt-4-turbo for OpenAI)
- `--server`: Specify the base URL of the remote AI server to use (for Ollama)
- `--debug`: Enable debug mode to print verbose output
- `--use-openai`: Use OpenAI instead of Ollama for summarization
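For orientation, the flags above map naturally onto an argparse interface. The sketch below is a hypothetical reconstruction of the CLI, not the script's actual code; only the flag names and defaults are taken from the descriptions above.

```python
import argparse

# Hypothetical sketch of the CLI described above; webaisum.py's real
# argument handling may differ.
parser = argparse.ArgumentParser(description="Summarize a web page using an AI model")
parser.add_argument("url", help="URL of the web page to summarize")
parser.add_argument("--model", default=None,
                    help="AI model to use (default: llama3 for Ollama, gpt-4-turbo for OpenAI)")
parser.add_argument("--server", default=None,
                    help="Base URL of the remote Ollama server")
parser.add_argument("--debug", action="store_true",
                    help="Enable verbose output")
parser.add_argument("--use-openai", action="store_true",
                    help="Use OpenAI instead of Ollama")
args = parser.parse_args()

# Fall back to the documented defaults when --model is not given.
model = args.model or ("gpt-4-turbo" if args.use_openai else "llama3")
```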
Summarize a web page using the default Ollama model:
python webaisum.py https://example.com
Summarize a web page using a specific Ollama model and remote server:
python webaisum.py https://example.com --model llama2 --server http://ollama-server.com
Summarize a web page using OpenAI:
python webaisum.py https://example.com --use-openai
Summarize a web page using OpenAI with a specific model:
python webaisum.py https://example.com --use-openai --model gpt-3.5-turbo-16k
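Internally, a summarizer of this kind fetches the page content and hands it to a chat model. The following is a minimal sketch of that flow built from the dependencies listed above (requests, langchain_community, langchain_openai); it illustrates the general approach, and the actual prompt, text extraction, and chain used by webaisum.py may differ. Note that ChatOpenAI reads the OPENAI_API_KEY environment variable.

```python
import requests
from langchain_community.chat_models import ChatOllama
from langchain_openai import ChatOpenAI  # reads OPENAI_API_KEY from the environment


def summarize(url, use_openai=False, model=None, server=None):
    # Fetch the raw page; a full implementation would strip the HTML down
    # to readable text before prompting the model.
    page = requests.get(url, timeout=30).text

    if use_openai:
        llm = ChatOpenAI(model=model or "gpt-4-turbo")
    else:
        llm = ChatOllama(model=model or "llama3",
                         base_url=server or "http://localhost:11434")

    # Truncate to keep the request within a typical context window.
    prompt = "Summarize the following web page:\n\n" + page[:8000]
    return llm.invoke(prompt).content


if __name__ == "__main__":
    print(summarize("https://example.com"))
```

Truncating the page text is a simple way to stay within a model's context window; a more robust approach would split long pages into chunks and summarize them in stages.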
Contributions are welcome! If you find any issues or have suggestions for improvements, please open an issue or submit a pull request.
This project is licensed under the MIT License.