
roscopecoltran/gcse

Original source (github.com)
Tags: golang go search-engine
Clipped on: 2018-04-05
Project for Go Search, a search engine for finding popular and relevant packages. http://go-search.org/
Languages: Go, HTML, CSS, JavaScript, Makefile, Batchfile, Shell
This branch is 9 commits ahead of daviddengcn:master.
cmd: remove data dirs (2 months ago)
dist: remove data dirs (2 months ago)
front/chrome-app: re-organize pkg and executables directories (2 months ago)
pkg: remove data dirs (2 months ago)
scripts: dispatch scripts per platform (2 months ago)
shared/data/imports/0: remove data dirs (2 months ago)
.env-example: prepare dockerfile, docker-compose, env file (2 months ago)
.gitignore: remove data dirs (2 months ago)
ACKNOWLEDGEMENTS: Create ACKNOWLEDGEMENTS (4 years ago)
LICENSE: Create LICENSE (4 years ago)
Makefile: remove data dirs (2 months ago)
README.md: add more targets to Makefile (2 months ago)
conf.json-example: remove data dirs (2 months ago)
conf.yaml-example: remove data dirs (2 months ago)
crane.yml: prepare dockerfile, docker-compose, env file (2 months ago)
crawler: remove data dirs (2 months ago)
docker-compose.yml: prepare dockerfile, docker-compose, env file (2 months ago)
glide.lock: add glide.yaml, update gitignore (2 months ago)
glide.yaml: add glide.yaml, update gitignore (2 months ago)
indexer: remove data dirs (2 months ago)
license.txt: Various minor update for spelling and links. (5 years ago)
mergedocs: remove data dirs (2 months ago)
server: remove data dirs (2 months ago)
spider: remove data dirs (2 months ago)

README.md

Go Search

A keyword search engine that helps people find popular and relevant Go packages.

Online service: Go Search (http://go-search.org/)

This is the root package with shared functions.

Sub-packages are the commands for running each stage:

  • HTTP Server: searching and web service.
  • ToCrawl: finds packages to crawl.
  • Crawler: crawls package files.
  • MergeDocs: merges crawled package files into the doc DB.
  • Indexer: analyzes package information and generates indexed data for searching.

Development

You'll need to perform the following steps to get a basic server running:

  1. Create a basic conf.json file, limiting the crawler to a one-minute run: { "crawler": { "due_per_run": "1m" } }
  2. Run the package finder: go run cmd/tocrawl/*.go
  3. Run the crawler: go run cmd/crawler/*.go
  4. Merge the crawled docs: go run cmd/mergedocs/*.go
  5. Run the indexer: go run cmd/indexer/*.go
  6. Run the server: go run cmd/server/*.go
  7. Visit http://localhost:8080 in your browser

LICENSE

BSD license.
