Comments (11)
@cydrain
could you take a look at this?
from milvus.
/assign @xiaocai2333
/unassign
/assign @foxspy
please help on it.
binary index crashed during building stage
/assign @cydrain
Hi @ThreadDao,
HNSW with the binary vector data type is not fully supported, and it has not been officially announced.
Please remove the "critical-urgent" tag.
This issue occurs because Milvus creates an HNSW index for a binary vector field with the COSINE metric type.
The COSINE metric type is only available for float vectors; we need to add stricter parameter validity checks.
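A minimal sketch of the kind of parameter validity check described here. The function and its name are illustrative, not Milvus internals; the metric lists follow the Milvus documentation (L2/IP/COSINE for float vectors, HAMMING/JACCARD for binary vectors):

```python
# Hypothetical sketch of a metric-type validity check, not the actual Milvus code.
FLOAT_VECTOR_METRICS = {"L2", "IP", "COSINE"}
BINARY_VECTOR_METRICS = {"HAMMING", "JACCARD"}

def check_metric_type(field_dtype: str, metric_type: str) -> None:
    """Raise if the metric type does not match the vector field's data type."""
    if field_dtype == "BINARY_VECTOR" and metric_type not in BINARY_VECTOR_METRICS:
        raise ValueError(f"metric type {metric_type} only supports float vector")
    if field_dtype == "FLOAT_VECTOR" and metric_type not in FLOAT_VECTOR_METRICS:
        raise ValueError(f"metric type {metric_type} is not valid for float vector")
```

With such a check in place, the combination in this issue (BINARY_VECTOR with COSINE) is rejected at index-creation time instead of crashing during the build stage.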
This issue can be reproduced with the following script:
import random
import numpy as np
from pymilvus import (
    connections,
    FieldSchema, CollectionSchema, DataType,
    Collection,
    utility
)

_HOST = '127.0.0.1'
_PORT = '19530'

# Const names
_COLLECTION_NAME = 'demo'
_ID_FIELD_NAME = 'id_field'
_VECTOR_FIELD_NAME = 'bin_vector_field'

# Vector parameters
_DIM = 128
_INDEX_FILE_SIZE = 32  # max file size of stored index
_NQ = 10

# Index parameters
_METRIC_TYPE = 'COSINE'
_INDEX_TYPE = 'HNSW'
_NLIST = 128
_NPROBE = 16
_TOPK = 5
_EFC = 200
_EF = 64
_M = 8

# Create a Milvus connection
def create_connection():
    print("\nCreate connection...")
    connections.connect(host=_HOST, port=_PORT)
    print("\nList connections:")
    print(connections.list_connections())

# Create a collection named 'demo'
def create_collection(name, id_field, vector_field):
    field1 = FieldSchema(name=id_field, dtype=DataType.INT64, description="int64", is_primary=True)
    field2 = FieldSchema(name=vector_field, dtype=DataType.BINARY_VECTOR, description="binary vector", dim=_DIM,
                         is_primary=False)
    schema = CollectionSchema(fields=[field1, field2], description="collection description")
    collection = Collection(name=name, data=None, schema=schema, properties={"collection.ttl.seconds": 15})
    print("\ncollection created:", name)
    return collection

def has_collection(name):
    return utility.has_collection(name)

# Drop a collection in Milvus
def drop_collection(name):
    collection = Collection(name)
    collection.drop()
    print("\nDrop collection: {}".format(name))

# List all collections in Milvus
def list_collections():
    print("\nlist collections:")
    print(utility.list_collections())

def insert(collection, num, dim):
    raw_vectors = []
    binary_vectors = []
    for i in range(num):
        raw_vector = [random.randint(0, 1) for _ in range(dim)]
        raw_vectors.append(raw_vector)
        binary_vectors.append(bytes(np.packbits(raw_vector, axis=-1).tolist()))
    data = [
        [i for i in range(num)],
        binary_vectors,
    ]
    collection.insert(data)
    return data[1]

def get_entity_num(collection):
    print("\nThe number of entities:")
    print(collection.num_entities)

def create_index(collection, field_name):
    index_param = {
        "index_type": _INDEX_TYPE,
        "params": {"nlist": _NLIST, "M": _M, "efConstruction": _EFC},
        "metric_type": _METRIC_TYPE}
    collection.create_index(field_name, index_param)
    print("\nCreated index:\n{}".format(collection.index().params))

def drop_index(collection):
    collection.drop_index()
    print("\nDrop index successfully")

def load_collection(collection):
    collection.load()

def release_collection(collection):
    collection.release()

def search(collection, vector_field, id_field, search_vectors):
    search_param = {
        "data": search_vectors,
        "anns_field": vector_field,
        "param": {"metric_type": _METRIC_TYPE, "params": {"nprobe": _NPROBE, "ef": _EF}},
        "limit": _TOPK,
        "expr": "id_field >= 0"}
    results = collection.search(**search_param)
    for i, result in enumerate(results):
        print("\nSearch result for {}th vector: ".format(i))
        for j, res in enumerate(result):
            print("Top {}: {}".format(j, res))

def set_properties(collection):
    collection.set_properties(properties={"collection.ttl.seconds": 1800})

def main():
    # create a connection
    create_connection()
    # drop collection if the collection exists
    if has_collection(_COLLECTION_NAME):
        drop_collection(_COLLECTION_NAME)
    # create collection
    collection = create_collection(_COLLECTION_NAME, _ID_FIELD_NAME, _VECTOR_FIELD_NAME)
    # alter ttl properties of collection level
    set_properties(collection)
    # show collections
    list_collections()
    # insert 10000 vectors with 128 dimensions
    vectors = insert(collection, 10000, _DIM)
    collection.flush()
    # get the number of entities
    get_entity_num(collection)
    # create index
    create_index(collection, _VECTOR_FIELD_NAME)
    # load data to memory
    load_collection(collection)
    # search
    search(collection, _VECTOR_FIELD_NAME, _ID_FIELD_NAME, vectors[:_NQ])
    # release memory
    release_collection(collection)
    # drop collection index
    drop_index(collection)
    # drop collection
    drop_collection(_COLLECTION_NAME)

if __name__ == '__main__':
    main()
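For context on the insert helper in the script above: np.packbits collapses the 128 zero/one values into 16 bytes, which is the byte-string format pymilvus expects for a BINARY_VECTOR field. A standalone check of that packing step:

```python
import random
import numpy as np

# Pack a 128-dim 0/1 vector into bytes, exactly as the reproduction script does.
raw_vector = [random.randint(0, 1) for _ in range(128)]
packed = bytes(np.packbits(raw_vector, axis=-1).tolist())
assert len(packed) == 128 // 8  # 16 bytes carry 128 bits
```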
With #31825, Milvus raises an exception when creating an HNSW index on a binary vector field:
File "/home/caiyd/work/vec/pymilvus/pymilvus/client/utils.py", line 60, in check_status
raise MilvusException(status.code, status.reason, status.error_code)
pymilvus.exceptions.MilvusException: <MilvusException: (code=1100, message=only support float vector: invalid parameter[expected=valid index params][actual=invalid index params])>
@ThreadDao
HNSW on binary vectors has been disabled; please verify.
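Since HNSW on binary vectors is disabled, a binary vector field can still be indexed with a binary-compatible index type and metric. A sketch of valid parameters per the Milvus documentation for binary vectors (the create_index call is commented out because it needs a running server):

```python
# Valid index parameters for a BINARY_VECTOR field:
# BIN_IVF_FLAT with the HAMMING metric instead of HNSW with COSINE.
binary_index_param = {
    "index_type": "BIN_IVF_FLAT",
    "params": {"nlist": 128},
    "metric_type": "HAMMING",  # JACCARD is the other binary-compatible metric
}
# collection.create_index('bin_vector_field', binary_index_param)
```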
fixed