
teamgram-server's People

Contributors

2ei, abirdcfly, alikhadivi, blogcloud, devnulldevzero, haupc, iineva, libi, lionpuchipuchi, mittwillson, ngocanh1909, qusonann, teamgramio, techiedesu, testwill, wubenqi, yeehaw456789, zig5000

teamgram-server's Issues

Error when running the frontend module

Following the documented build and run order, the final step of running the frontend module reports the port as occupied: {0xc0000f71e0 /nebulaim/frontend80/node1 {"Addr":"127.0.0.1:10000","Metadata":{}} 10000000000 0xc00009d500 0x593570}
F0529 21:03:22.584240 18720 mtproto_server.go:66] Endpoint already exists for id: client-bufv8c5dtrq8
The source file looks like this: (screenshot)
The session module's config file also sets the port to 10000. Could that be the problem?
(screenshot)
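A duplicate listen address would explain this report. A quick generic Go check (a sketch, not project code) to confirm whether the port is already taken before starting a module:

package main

import (
	"fmt"
	"net"
)

func main() {
	// Try to bind the suspect address; failure means another module
	// (here, possibly session) already owns port 10000.
	ln, err := net.Listen("tcp", "127.0.0.1:10000")
	if err != nil {
		fmt.Println("127.0.0.1:10000 is already in use:", err)
		return
	}
	defer ln.Close()
	fmt.Println("127.0.0.1:10000 is free")
}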

frontend issue `key not found`

Problem

Session cannot be found after some time.

The reason is:

After some time, the services can no longer find each other.

Current Solution

Restart the other services when the frontend logs the text key not found.

Ideal Solution

It should never happen.
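A minimal sketch of a registration that avoids this, assuming the registry is etcd (the /nebulaim/... keys in other logs on this page suggest it): register the service key under a lease, keep the lease alive, and re-register when it is lost instead of restarting services by hand. The endpoints, TTL, and key below are assumptions, not teamgram defaults.

package main

import (
	"context"
	"log"
	"time"

	"go.etcd.io/etcd/clientv3"
)

func registerWithLease(endpoints []string, key, addr string) error {
	cli, err := clientv3.New(clientv3.Config{
		Endpoints:   endpoints,
		DialTimeout: 5 * time.Second,
	})
	if err != nil {
		return err
	}
	// The key vanishes when the lease expires, which is when peers start
	// logging "key not found".
	lease, err := cli.Grant(context.Background(), 10)
	if err != nil {
		return err
	}
	if _, err := cli.Put(context.Background(), key, addr, clientv3.WithLease(lease.ID)); err != nil {
		return err
	}
	ch, err := cli.KeepAlive(context.Background(), lease.ID)
	if err != nil {
		return err
	}
	go func() {
		for range ch {
			// drain keep-alive acks
		}
		// Channel closed: the lease is gone, so re-register here rather
		// than restarting the other services.
		log.Printf("lease for %s lost; re-registering", key)
	}()
	return nil
}

func main() {
	if err := registerWithLease([]string{"127.0.0.1:2379"},
		"/nebulaim/frontend80/node1", "127.0.0.1:10000"); err != nil {
		log.Fatal(err)
	}
	select {} // block forever for the demo
}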

Not an issue

Hello, this is not an issue, just some general questions:
1. What does "bff" stand for? (naming convention)
2. The build doc is old (not updated for the new project architecture). Correct me if I'm wrong.
3. Personally, I liked the old project architecture: it was well organized (packages, etc.). I'm not talking about the code.

Thank you for all your efforts and time.

The client cannot connect to the server?

I deployed on both Windows 10 and Ubuntu 18.04, but in testing the client cannot connect. Could you explain?
The frontend prints:

I0407 09:07:46.337977 10096 mtproto_server.go:94] onNewConnection 192.168.1.23:6044
I0407 09:07:46.338954 10096 server.go:140] onServerNewConnection - {peer: {connID: 58@frontend443-(192.168.1.23:12345->192.168.1.23:6044)}, ctx: {&{{0 0} 256 0xc00030ce10 0}}}
I0407 09:07:46.340953 10096 mtproto_proxy_codec.go:236] first_bytes_64: c7d428f9fe2f4b3da6f530529833f967a1db6dab1bb19a0e6e39e382dec363d685bafac770866d8bd229404cecdaefcaf091b35e88dbe5b6efefefef0100f79b
I0407 09:07:46.340953 10096 mtproto_app_codec.go:63] size1: 40
I0407 09:07:46.340953 10096 mtproto_server.go:102] onConnectionDataArrived 192.168.1.23:6044
I0407 09:07:46.342951 10096 server.go:145] onServerMessageDataArrived - receive data: {peer: {connID: 58@frontend443-(192.168.1.23:12345->192.168.1.23:6044)}, md: {conn_type: 1, auth_key_id: 0, quick_ack_id: 0, payload_len: 40}, msg: %!s(MISSING)}
I0407 09:07:46.343951 10096 server.go:301] onServerUnencryptedRawMessage - receive data: {peer: {connID: 58@frontend443-(192.168.1.23:12345->192.168.1.23:6044)}, ctx: &{{%!s(int32=0) %!s(uint32=0)} %!s(int=256) %!s(*mtproto.TLHandshakeData=&{0xc0009085f0 {} [] 0}) %!s(int64=0)}, msg: {conn_type: 1, auth_key_id: 0, quick_ack_id: 0, payload_len: 40}}
I0407 09:07:46.343951 10096 server.go:340] sendMessage - handshake: {peer: {connID: 58@frontend443-(192.168.1.23:12345->192.168.1.23:6044)}, md: request:<service_name:"handshake" method_name:"mtproto.TLHandshakeData" log_id:1247330190485360640 trace_id:1247330190485360642 span_id:1247330190485360643 > correlation_id:1247330190485360641 attachment_size:40 mtproto_meta:<auth_key_id:0 server_id:1 client_conn_id:58 client_addr:"192.168.1.23:6044" from:"frontend" receive_time:1586221666343 > , msg: data2:<state:513 client_conn_id:58 ctx:<constructor:CRC32_handshakeContext data2:<> > > }
I0407 09:07:46.349948 10096 server.go:223] onClientHandshakeMessage - handshake: peer({connID: 1@handshake-(127.0.0.1:5702->127.0.0.1:10005)}), state: {data2:<state:514 res_state:1 client_conn_id:58 ctx:<constructor:CRC32_handshakeContext data2:<nonce:"XM\240\027C\311\375\207\221\275\036}\341d\313\267" server_nonce:"\020uW\342\277\003\242?\004\263\210g^\254\277" > > > }
I0407 09:07:46.349948 10096 server.go:245] onClientHandshakeMessage - sendToClient to: {peer: {connID: 58@frontend443-(192.168.1.23:12345->192.168.1.23:6044)}, handshake: data2:<state:514 res_state:1 client_conn_id:58 ctx:<constructor:CRC32_handshakeContext data2:<nonce:"XM\240\027C\311\375\207\221\275\036}\341d\313\267" server_nonce:"\020uW\342\277\003\242?\004\263\210g^\254\277" > > > }
I0407 09:07:46.356944 10096 mtproto_app_codec.go:63] size1: 88
I0407 09:07:46.356944 10096 mtproto_server.go:102] onConnectionDataArrived 192.168.1.23:6044
I0407 09:07:46.357944 10096 server.go:145] onServerMessageDataArrived - receive data: {peer: {connID: 58@frontend443-(192.168.1.23:12345->192.168.1.23:6044)}, md: {conn_type: 1, auth_key_id: 7899566645959756554, quick_ack_id: 0, payload_len: 88}, msg: %!s(MISSING)}
I0407 09:07:46.357944 10096 server.go:345] onServerEncryptedRawMessage - receive data: {peer: {connID: 58@frontend443-(192.168.1.23:12345->192.168.1.23:6044)}, ctx: &{{%!s(int32=0) %!s(uint32=0)} %!s(int=512) %!s(*mtproto.TLHandshakeData=&{0xc000713900 {} [] 0}) %!s(int64=0)}, msg: {conn_type: 1, auth_key_id: 7899566645959756554, quick_ack_id: 0, payload_len: 88}}
E0407 09:07:46.358957 10096 zrpc_client.go:140] not found kaddr by key: 7899566645959756554
I0407 09:07:48.825550 10096 mtproto_server.go:119] onConnectionClosed - 192.168.1.23:6044
I0407 09:07:48.835371 10096 server.go:176] onServerConnectionClosed - {peer: {connID: 58@frontend443-(192.168.1.23:12345->192.168.1.23:6044)}}

fatal error: runtime: out of memory

Not compiling, please help!

Traceback:

# github.com/nebula-chat/chatengine/mtproto
fatal error: runtime: out of memory

runtime stack:
runtime.throw(0xe44822, 0x16)
        /usr/local/go/src/runtime/panic.go:774 +0x72
runtime.sysMap(0xc014000000, 0x4000000, 0x15c6938)
        /usr/local/go/src/runtime/mem_linux.go:169 +0xc5
runtime.(*mheap).sysAlloc(0x159e940, 0x2000, 0x41c922, 0x7fbf1be36008)
        /usr/local/go/src/runtime/malloc.go:701 +0x1cd
runtime.(*mheap).grow(0x159e940, 0x1, 0xffffffff)
        /usr/local/go/src/runtime/mheap.go:1252 +0x42
runtime.(*mheap).allocSpanLocked(0x159e940, 0x1, 0x15c6948, 0xc000021320)
        /usr/local/go/src/runtime/mheap.go:1163 +0x291
runtime.(*mheap).alloc_m(0x159e940, 0x1, 0x7ffc0fa70012, 0x45892a)
        /usr/local/go/src/runtime/mheap.go:1015 +0xc2
runtime.(*mheap).alloc.func1()
        /usr/local/go/src/runtime/mheap.go:1086 +0x4c
runtime.systemstack(0x45a284)
        /usr/local/go/src/runtime/asm_amd64.s:370 +0x66
runtime.mstart()
        /usr/local/go/src/runtime/proc.go:1146

goroutine 1 [running]:
runtime.systemstack_switch()
        /usr/local/go/src/runtime/asm_amd64.s:330 fp=0xc0132d2478 sp=0xc0132d2470 pc=0x45a380
runtime.(*mheap).alloc(0x159e940, 0x1, 0xc00d010012, 0xc0132d2628)
        /usr/local/go/src/runtime/mheap.go:1085 +0x8a fp=0xc0132d24c8 sp=0xc0132d2478 pc=0x424f1a
runtime.(*mcentral).grow(0x159f160, 0x0)
        /usr/local/go/src/runtime/mcentral.go:255 +0x7b fp=0xc0132d2508 sp=0xc0132d24c8 pc=0x416f7b
runtime.(*mcentral).cacheSpan(0x159f160, 0x203000)
        /usr/local/go/src/runtime/mcentral.go:106 +0x2fe fp=0xc0132d2568 sp=0xc0132d2508 pc=0x416a9e
runtime.(*mcache).refill(0x7fbf1be29008, 0x12)
        /usr/local/go/src/runtime/mcache.go:138 +0x85 fp=0xc0132d2588 sp=0xc0132d2568 pc=0x416545
runtime.(*mcache).nextFree(0x7fbf1be29008, 0xc0132d2512, 0x4435cc, 0x0, 0xe31ca0)
        /usr/local/go/src/runtime/malloc.go:854 +0x87 fp=0xc0132d25c0 sp=0xc0132d2588 pc=0x40b797
runtime.mallocgc(0x80, 0xe22d40, 0xfccc01, 0x0)
        /usr/local/go/src/runtime/malloc.go:1022 +0x793 fp=0xc0132d2660 sp=0xc0132d25c0 pc=0x40c0d3
runtime.newobject(0xe22d40, 0x0)
        /usr/local/go/src/runtime/malloc.go:1151 +0x38 fp=0xc0132d2690 sp=0xc0132d2660 pc=0x40c4c8
cmd/compile/internal/gc.(*Node).copy(...)
        /usr/local/go/src/cmd/compile/internal/gc/subr.go:397
cmd/compile/internal/gc.(*inlsubst).node(0xc0132d2f38, 0xc000b7f180, 0x321f0000051a6)
        /usr/local/go/src/cmd/compile/internal/gc/inl.go:1255 +0x91b fp=0xc0132d2898 sp=0xc0132d2690 pc=0xc4304b
cmd/compile/internal/gc.(*inlsubst).node(0xc0132d2f38, 0xc000b7f100, 0xc013fffb80)
        /usr/local/go/src/cmd/compile/internal/gc/inl.go:1263 +0x9f8 fp=0xc0132d2aa0 sp=0xc0132d2898 pc=0xc43128
cmd/compile/internal/gc.(*inlsubst).list(0xc0132d2f38, 0xc0132d2e90, 0x1, 0x4, 0xc0132d2da0)
        /usr/local/go/src/cmd/compile/internal/gc/inl.go:1182 +0xe0 fp=0xc0132d2b28 sp=0xc0132d2aa0 pc=0xc42660
cmd/compile/internal/gc.mkinlcall(0xc00653e400, 0xc0072ba000, 0x50, 0xc013ffe900)
        /usr/local/go/src/cmd/compile/internal/gc/inl.go:1074 +0x2d28 fp=0xc0132d2fc0 sp=0xc0132d2b28 pc=0xc40548
cmd/compile/internal/gc.inlnode(0xc00653e400, 0x50, 0xc013ffe900)
        /usr/local/go/src/cmd/compile/internal/gc/inl.go:696 +0xafa fp=0xc0132d31c0 sp=0xc0132d2fc0 pc=0xc3cb0a
cmd/compile/internal/gc.inlnodelist(0xc006529b60, 0x50)
        /usr/local/go/src/cmd/compile/internal/gc/inl.go:528 +0x6e fp=0xc0132d3200 sp=0xc0132d31c0 pc=0xc3bfbe
cmd/compile/internal/gc.inlnode(0xc00653c000, 0x50, 0xc0132d3430)
        /usr/local/go/src/cmd/compile/internal/gc/inl.go:623 +0x377 fp=0xc0132d3400 sp=0xc0132d3200 pc=0xc3c387
cmd/compile/internal/gc.inlcalls(0xc00653c000)
        /usr/local/go/src/cmd/compile/internal/gc/inl.go:484 +0x77 fp=0xc0132d3440 sp=0xc0132d3400 pc=0xc3b8d7
cmd/compile/internal/gc.Main.func6(0xc0013c4108, 0x1, 0xf, 0xc0130f5c00)
        /usr/local/go/src/cmd/compile/internal/gc/main.go:632 +0x3c fp=0xc0132d34d0 sp=0xc0132d3440 pc=0xd32ebc
cmd/compile/internal/gc.(*bottomUpVisitor).visit(0xc0132d37a8, 0xc00653c000, 0x0)
        /usr/local/go/src/cmd/compile/internal/gc/scc.go:118 +0x295 fp=0xc0132d3550 sp=0xc0132d34d0 pc=0xc96335
cmd/compile/internal/gc.(*bottomUpVisitor).visit.func1(0xc00652ab00, 0xc0132d35d0)
        /usr/local/go/src/cmd/compile/internal/gc/scc.go:81 +0xaf fp=0xc0132d35a8 sp=0xc0132d3550 pc=0xd349ff
cmd/compile/internal/gc.inspect(0xc00652ab00, 0xc0132d3760)
        /usr/local/go/src/cmd/compile/internal/gc/syntax.go:912 +0xfc fp=0xc0132d35c8 sp=0xc0132d35a8 pc=0xcf026c
cmd/compile/internal/gc.inspectList(0xc0065283a0, 0xc0132d3760)
        /usr/local/go/src/cmd/compile/internal/gc/syntax.go:925 +0x58 fp=0xc0132d3600 sp=0xc0132d35c8 pc=0xcf02e8
cmd/compile/internal/gc.inspect(0xc00652ab80, 0xc0132d3760)
        /usr/local/go/src/cmd/compile/internal/gc/syntax.go:918 +0xac fp=0xc0132d3620 sp=0xc0132d3600 pc=0xcf021c
cmd/compile/internal/gc.inspectList(0xc0065283c0, 0xc0132d3760)
        /usr/local/go/src/cmd/compile/internal/gc/syntax.go:925 +0x58 fp=0xc0132d3658 sp=0xc0132d3620 pc=0xcf02e8
cmd/compile/internal/gc.inspect(0xc00652a780, 0xc0132d3760)
        /usr/local/go/src/cmd/compile/internal/gc/syntax.go:919 +0xc8 fp=0xc0132d3678 sp=0xc0132d3658 pc=0xcf0238
cmd/compile/internal/gc.inspectList(0xc006528500, 0xc0132d3760)
        /usr/local/go/src/cmd/compile/internal/gc/syntax.go:925 +0x58 fp=0xc0132d36b0 sp=0xc0132d3678 pc=0xcf02e8
cmd/compile/internal/gc.inspect(0xc00652a280, 0xc0132d3760)
        /usr/local/go/src/cmd/compile/internal/gc/syntax.go:918 +0xac fp=0xc0132d36d0 sp=0xc0132d36b0 pc=0xcf021c
cmd/compile/internal/gc.inspectList(0xc006528520, 0xc0132d3760)
        /usr/local/go/src/cmd/compile/internal/gc/syntax.go:925 +0x58 fp=0xc0132d3708 sp=0xc0132d36d0 pc=0xcf02e8
cmd/compile/internal/gc.(*bottomUpVisitor).visit(0xc0132d37a8, 0xc006516580, 0xbfc3814800004106)
        /usr/local/go/src/cmd/compile/internal/gc/scc.go:76 +0x13d fp=0xc0132d3788 sp=0xc0132d3708 pc=0xc961dd
cmd/compile/internal/gc.visitBottomUp(0xc00bc64000, 0x8552, 0x8c00, 0xe5bf88)
        /usr/local/go/src/cmd/compile/internal/gc/scc.go:58 +0x95 fp=0xc0132d37e8 sp=0xc0132d3788 pc=0xc96075
cmd/compile/internal/gc.Main(0xe5be50)
        /usr/local/go/src/cmd/compile/internal/gc/main.go:623 +0x3a6f fp=0xc0132d3ee8 sp=0xc0132d37e8 pc=0xc4995f
main.main()
        /usr/local/go/src/cmd/compile/main.go:51 +0xac fp=0xc0132d3f60 sp=0xc0132d3ee8 pc=0xd7fddc
runtime.main()
        /usr/local/go/src/runtime/proc.go:203 +0x21e fp=0xc0132d3fe0 sp=0xc0132d3f60 pc=0x42f35e
runtime.goexit()
        /usr/local/go/src/runtime/asm_amd64.s:1357 +0x1 fp=0xc0132d3fe8 sp=0xc0132d3fe0 pc=0x45c2d1

Go: 1.13 and 1.14.7
OS: Linux 3.10.0-1127.18.2.el7.x86_64 #1 SMP Sun Jul 26 15:27:06 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux
CPU: 1x QEMU Virtual CPU version 2.5+ (2400 MHz, Xeon E5)
RAM: 512 MB
Disk: 7 GB SSD

(The trace shows the Go compiler itself running out of memory while inlining, so 512 MB of RAM is most likely simply too little to build this package.)

session: fatal error: concurrent map read and map write

2020-07-10T06:12:27.414342Z debug server/session_handler.go:1448 onInvokeWithoutUpdatesExt - request data: {sess: {user_id: 10000028, auth_key_id: 5911271281717399216, session_id: -3802101505398063149, state: 1, conn_id_list: [{1 1594361547}]}, conn_id: %!s(int32=1), msg_id: 6847730703308426240, seq_no: 1, request: {*server.TLInvokeWithoutUpdatesExt}}
2020-07-10T06:12:27.414357Z debug server/session_handler.go:1481 onRpcRequest - request data: {sess: {user_id: 10000028, auth_key_id: 5911271281717399216, session_id: -3802101505398063149, state: 1, conn_id_list: [{1 1594361547}]}, conn_id: %!s(int32=1), msg_id: 6847730703308426240, seq_no: 1, request: {*mtproto.TLUploadGetFile}}
2020-07-10T06:12:27.414375Z info server/session_handler.go:1570 genericSession]]>> - &{1594361547 0 0xc07a367500 1 0 0 }
2020-07-10T06:12:27.414389Z debug server/session_handler.go:1448 onInvokeWithoutUpdatesExt - request data: {sess: {user_id: 10000028, auth_key_id: 5911271281717399216, session_id: -3802101505398063149, state: 1, conn_id_list: [{1 1594361547}]}, conn_id: %!s(int32=1), msg_id: 6847730703309220864, seq_no: 3, request: {*server.TLInvokeWithoutUpdatesExt}}
2020-07-10T06:12:27.414401Z debug server/session_handler.go:1481 onRpcRequest - request data: {sess: {user_id: 10000028, auth_key_id: 5911271281717399216, session_id: -3802101505398063149, state: 1, conn_id_list: [{1 1594361547}]}, conn_id: %!s(int32=1), msg_id: 6847730703309220864, seq_no: 3, request: {*mtproto.TLUploadGetFile}}
2020-07-10T06:12:27.414413Z info server/session_handler.go:1570 genericSession]]>> - &{1594361547 0 0xc07a367580 1 0 0 }
2020-07-10T06:12:27.414493Z debug server/session_server.go:203 sendDataToGate - map[/nebulaim/egate/node1:0xc00009cfc0]
2020-07-10T06:12:27.415572Z debug server/auth_sessions.go:324 apiRequests - #&{1 -3802101505398063149 [0xc008f6f640 0xc008f6f740]}
fatal error: concurrent map read and map write

goroutine 293178 [running]:
runtime.throw(0x2093c76, 0x21)
/usr/local/Cellar/go/1.14.3/libexec/src/runtime/panic.go:1116 +0x72 fp=0xc001addef0 sp=0xc001addec0 pc=0x434752
runtime.mapaccess2_fast64(0x1dc4b60, 0xc009809830, 0xcb3c38d342c53fd3, 0xc009d18d20, 0x44)
/usr/local/Cellar/go/1.14.3/libexec/src/runtime/map_fast64.go:61 +0x1a5 fp=0xc001addf18 sp=0xc001addef0 pc=0x4119c5
nebula.chat/enterprise/session/server.(*authSessions).onRpcRequest(0xc009435380, 0xc01a5e90e0)
/data/go/src/im-service/enterprise/session/server/auth_sessions.go:515 +0x4d fp=0xc001addf88 sp=0xc001addf18 pc=0x1c80d5d
nebula.chat/enterprise/session/server.(*authSessions).rpcRunLoop(0xc009435380)
/data/go/src/im-service/enterprise/session/server/auth_sessions.go:325 +0x87 fp=0xc001addfd8 sp=0xc001addf88 pc=0x1c7f897
runtime.goexit()
/usr/local/Cellar/go/1.14.3/libexec/src/runtime/asm_amd64.s:1373 +0x1 fp=0xc001addfe0 sp=0xc001addfd8 pc=0x466d81
created by nebula.chat/enterprise/session/server.(*authSessions).Start
/data/go/src/im-service/enterprise/session/server/auth_sessions.go:266 +0x62

goroutine 1 [chan receive, 1148 minutes]:
nebula.chat/enterprise/pkg/commands.runServer(0x20647d0, 0xc, 0xc0000885a0, 0x0, 0x0)
/data/go/src/im-service/enterprise/pkg/commands/root.go:98 +0x2a4
nebula.chat/enterprise/pkg/commands.ServerCmdF(0x3385f60, 0x33cb438, 0x0, 0x0)
/data/go/src/im-service/enterprise/pkg/commands/root.go:74 +0xba
github.com/spf13/cobra.(*Command).execute(0x3385f60, 0xc0001841b0, 0x0, 0x0, 0x3385f60, 0xc0001841b0)
/data/go/pkg/mod/github.com/spf13/[email protected]/command.go:830 +0x29d
github.com/spf13/cobra.(*Command).ExecuteC(0x3385f60, 0x162010b4f6fd85f7, 0x339f2a0, 0x5)
/data/go/pkg/mod/github.com/spf13/[email protected]/command.go:914 +0x2fb
github.com/spf13/cobra.(*Command).Execute(...)
/data/go/pkg/mod/github.com/spf13/[email protected]/command.go:864
nebula.chat/enterprise/pkg/commands.Run(0xc0001841b0, 0x0, 0x0, 0x23344c0, 0xc0000a2000, 0xc000182058, 0x0)
/data/go/src/im-service/enterprise/pkg/commands/root.go:57 +0x333
......
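The trace shows one goroutine reading a map inside authSessions while another writes it. The standard fix is to serialize access to the shared map; here is a minimal generic sketch (made-up types, not the project's actual code) guarding a session map with sync.RWMutex:

package main

import (
	"fmt"
	"sync"
)

type session struct{ userID int64 }

type sessionTable struct {
	mu       sync.RWMutex
	sessions map[int64]*session
}

func newSessionTable() *sessionTable {
	return &sessionTable{sessions: make(map[int64]*session)}
}

// get takes a read lock, so many readers can proceed in parallel.
func (t *sessionTable) get(id int64) (*session, bool) {
	t.mu.RLock()
	defer t.mu.RUnlock()
	s, ok := t.sessions[id]
	return s, ok
}

// put takes the write lock, excluding readers and other writers, which is
// exactly what prevents "concurrent map read and map write".
func (t *sessionTable) put(id int64, s *session) {
	t.mu.Lock()
	defer t.mu.Unlock()
	t.sessions[id] = s
}

func main() {
	t := newSessionTable()
	t.put(10000028, &session{userID: 10000028})
	if s, ok := t.get(10000028); ok {
		fmt.Println("found session for", s.userID)
	}
}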

Telegram FOSS integration

Trying to get

git clone --recursive https://github.com/Telegram-FOSS-Team/Telegram-FOSS.git
cd Telegram-FOSS/
git checkout a655fde98089ad949f1d131cb6d627d70bb5159e

Got:
not found

What version of telegram-foss did you use?

What version of telegram-android works for you?

build error on linux

root@famous-hops-6:~/chatengine# make
docker build -t chatengine/server:latest .
Sending build context to Docker daemon  48.55MB
Step 1/32 : FROM golang:1.12.12 AS builder
 ---> 25f57e47afce
Step 2/32 : ENV CGO_ENABLED 0
 ---> Using cache
 ---> f0d0a1d2204d
Step 3/32 : ENV TARGET_DIR $GOPATH/src/github.com/nebula-chat/chatengine
 ---> Using cache
 ---> 09946a47c98a
Step 4/32 : RUN echo $GOPATH
 ---> Using cache
 ---> 0d750830c072
Step 5/32 : RUN mkdir -p $TARGET_DIR
 ---> Using cache
 ---> a01a63e5a362
Step 6/32 : RUN cd $TARGET_DIR
 ---> Using cache
 ---> 5a27f4851aa8
Step 7/32 : COPY . $TARGET_DIR/
 ---> fd6171417257
Step 8/32 : RUN cd ${TARGET_DIR}/messenger/biz_server && go build -ldflags='-s -w'
 ---> Running in 9374c9bf23b1
messenger.go:23:2: cannot find package "github.com/BurntSushi/toml" in any of:
        /usr/local/go/src/github.com/BurntSushi/toml (from $GOROOT)
        /go/src/github.com/BurntSushi/toml (from $GOPATH)
../../service/idgen/client/snowflake_idgen.go:22:2: cannot find package "github.com/bwmarrin/snowflake" in any of:
        /usr/local/go/src/github.com/bwmarrin/snowflake (from $GOROOT)
        /go/src/github.com/bwmarrin/snowflake (from $GOPATH)
../../pkg/mysql_client/mysql_config.go:21:2: cannot find package "github.com/go-sql-driver/mysql" in any of:
        /usr/local/go/src/github.com/go-sql-driver/mysql (from $GOROOT)
        /go/src/github.com/go-sql-driver/mysql (from $GOPATH)
../../pkg/grpc_util/rpc_util.go:22:2: cannot find package "github.com/gogo/protobuf/proto" in any of:
        /usr/local/go/src/github.com/gogo/protobuf/proto (from $GOROOT)
        /go/src/github.com/gogo/protobuf/proto (from $GOPATH)
messenger.go:24:2: cannot find package "github.com/golang/glog" in any of:
        /usr/local/go/src/github.com/golang/glog (from $GOROOT)
        /go/src/github.com/golang/glog (from $GOPATH)
../../mtproto/nbfs_service.pb.go:6:8: cannot find package "github.com/golang/protobuf/proto" in any of:
        /usr/local/go/src/github.com/golang/protobuf/proto (from $GOROOT)
        /go/src/github.com/golang/protobuf/proto (from $GOPATH)
../../pkg/grpc_util/rpc_metadata.pb.go:18:8: cannot find package "github.com/golang/protobuf/ptypes/any" in any of:
        /usr/local/go/src/github.com/golang/protobuf/ptypes/any (from $GOROOT)
        /go/src/github.com/golang/protobuf/ptypes/any (from $GOPATH)
../../service/idgen/client/redis_seq_client.go:24:2: cannot find package "github.com/gomodule/redigo/redis" in any of:
        /usr/local/go/src/github.com/gomodule/redigo/redis (from $GOROOT)
        /go/src/github.com/gomodule/redigo/redis (from $GOPATH)
../../pkg/grpc_util/middleware/recovery2/grpc_recovery_server.go:21:2: cannot find package "github.com/grpc-ecosystem/go-grpc-middleware" in any of:
        /usr/local/go/src/github.com/grpc-ecosystem/go-grpc-middleware (from $GOROOT)
        /go/src/github.com/grpc-ecosystem/go-grpc-middleware (from $GOPATH)
../../pkg/grpc_util/rpc_auth_handler.go:22:2: cannot find package "github.com/grpc-ecosystem/go-grpc-middleware/auth" in any of:
        /usr/local/go/src/github.com/grpc-ecosystem/go-grpc-middleware/auth (from $GOROOT)
        /go/src/github.com/grpc-ecosystem/go-grpc-middleware/auth (from $GOPATH)
../../pkg/grpc_util/rpc_auth_handler.go:23:2: cannot find package "github.com/grpc-ecosystem/go-grpc-middleware/tags" in any of:
        /usr/local/go/src/github.com/grpc-ecosystem/go-grpc-middleware/tags (from $GOROOT)
        /go/src/github.com/grpc-ecosystem/go-grpc-middleware/tags (from $GOPATH)
../../pkg/grpc_util/rpc_error_codec.go:25:2: cannot find package "github.com/grpc-ecosystem/go-grpc-middleware/util/metautils" in any of:
        /usr/local/go/src/github.com/grpc-ecosystem/go-grpc-middleware/util/metautils (from $GOROOT)
        /go/src/github.com/grpc-ecosystem/go-grpc-middleware/util/metautils (from $GOPATH)
biz/dal/dao/dao_manager.go:24:2: cannot find package "github.com/jmoiron/sqlx" in any of:
        /usr/local/go/src/github.com/jmoiron/sqlx (from $GOROOT)
        /go/src/github.com/jmoiron/sqlx (from $GOPATH)
biz/base/phone_number_util.go:23:2: cannot find package "github.com/nyaruka/phonenumbers" in any of:
        /usr/local/go/src/github.com/nyaruka/phonenumbers (from $GOROOT)
        /go/src/github.com/nyaruka/phonenumbers (from $GOPATH)
../../pkg/grpc_util/service_discovery/etcd3/registry.go:24:2: cannot find package "go.etcd.io/etcd/clientv3" in any of:
        /usr/local/go/src/go.etcd.io/etcd/clientv3 (from $GOROOT)
        /go/src/go.etcd.io/etcd/clientv3 (from $GOPATH)
../../pkg/grpc_util/service_discovery/etcd3/watcher.go:24:2: cannot find package "go.etcd.io/etcd/mvcc/mvccpb" in any of:
        /usr/local/go/src/go.etcd.io/etcd/mvcc/mvccpb (from $GOROOT)
        /go/src/go.etcd.io/etcd/mvcc/mvccpb (from $GOPATH)
../../pkg/grpc_util/service_discovery/etcd3/registry.go:25:2: cannot find package "golang.org/x/net/context" in any of:
        /usr/local/go/src/golang.org/x/net/context (from $GOROOT)
        /go/src/golang.org/x/net/context (from $GOPATH)
../../mtproto/nbfs_service.pb.go:12:2: cannot find package "google.golang.org/grpc" in any of:
        /usr/local/go/src/google.golang.org/grpc (from $GOROOT)
        /go/src/google.golang.org/grpc (from $GOPATH)
../../mtproto/rpc_error_codes_util.go:23:2: cannot find package "google.golang.org/grpc/codes" in any of:
        /usr/local/go/src/google.golang.org/grpc/codes (from $GOROOT)
        /go/src/google.golang.org/grpc/codes (from $GOPATH)
../../pkg/grpc_util/service_discovery/etcd3/registry.go:26:2: cannot find package "google.golang.org/grpc/grpclog" in any of:
        /usr/local/go/src/google.golang.org/grpc/grpclog (from $GOROOT)
        /go/src/google.golang.org/grpc/grpclog (from $GOPATH)
../../pkg/grpc_util/rpc_client.go:31:2: cannot find package "google.golang.org/grpc/metadata" in any of:
        /usr/local/go/src/google.golang.org/grpc/metadata (from $GOROOT)
        /go/src/google.golang.org/grpc/metadata (from $GOPATH)
../../pkg/grpc_util/service_discovery/etcd3/resolver.go:24:2: cannot find package "google.golang.org/grpc/naming" in any of:
        /usr/local/go/src/google.golang.org/grpc/naming (from $GOROOT)
        /go/src/google.golang.org/grpc/naming (from $GOPATH)
../../mtproto/rpc_error_codes_util.go:24:2: cannot find package "google.golang.org/grpc/status" in any of:
        /usr/local/go/src/google.golang.org/grpc/status (from $GOROOT)
        /go/src/google.golang.org/grpc/status (from $GOPATH)
The command '/bin/sh -c cd ${TARGET_DIR}/messenger/biz_server && go build -ldflags='-s -w'' returned a non-zero code: 1
Makefile:4: recipe for target 'build' failed
make: *** [build] Error 1

Can't find name: TLInvokeWithLayer

2020-05-11T01:49:50.162820Z error server/session_handler.go:1684 rpc error: code = 500 desc = INTERNAL: INTERNAL_SERVER_ERROR
2020-05-11T01:49:50.273698Z error mtproto/rpc_client_registers.go:352 Can't find name: TLInvokeWithLayer
2020-05-11T01:49:50.273833Z error grpc_util/rpc_client.go:107 Invoke error: constructor:CRC32_invokeWithLayer layer:97 query:"\270\210Qx\002\000\000\000\252\333\000\000\riPhone 7 Plus\000\000\00611.4.1\000\0071.0 (2)\nzh-Hans-HK\000\003ios\rclassic-zh-cn\000\000\235\324\301\231\025\304\265\034\003\000\000\000\331\033\336\300\010bundleId\000\000\000zv\036\267\nRG293LY76W\000\331\033\336\300\004data\000\000\000zv\036\267\034pvmTaKMP6WzT3WE+xKp2wb2RWyE=\000\000\000\331\033\336\300\004name\000\000\000zv\036\267^/UID=MJ69F8KJJW/CN=Apple Development: xxxxx (xxxxxxxxx)/OU=RG293LY76W/O=VPNGo Inc./C=US\000k\030\371\304" not regist!

Pulling chatengine (chatengine/server:latest)... error

[root@localhost chatengine]# docker-compose -f ./docker-compose.yml up -d
Pulling chatengine (chatengine/server:latest)...
ERROR: The image for the service you're trying to recreate has been removed. If you continue, volume data could be lost. Consider backing up your data before continuing.

Error when running go build

frontend.go:22:2: cannot find package "github.com/nebula-chat/chatengine/access/frontend/server" in any of:
/usr/lib/golang/src/github.com/nebula-chat/chatengine/access/frontend/server (from $GOROOT)
/root/go/src/github.com/nebula-chat/chatengine/access/frontend/server (from $GOPATH)
frontend.go:23:2: cannot find package "github.com/nebula-chat/chatengine/pkg/util" in any of:
/usr/lib/golang/src/github.com/nebula-chat/chatengine/pkg/util (from $GOROOT)
/root/go/src/github.com/nebula-chat/chatengine/pkg/util (from $GOPATH)

go build

I followed https://github.com/nebula-chat/chatengine

cd $GOPATH/src/github.com/nebula-chat/chatengine/access/frontend
go build

got

root@indachat:/var/workspace/src/github.com/nebula-chat/chatengine/access/frontend# go build
../../vendor/github.com/coreos/etcd/clientv3/auth.go:18:2: cannot find package "context" in any of:
        /var/workspace/src/github.com/nebula-chat/chatengine/vendor/github.com/coreos/etcd/vendor/context (vendor tree)
        /var/workspace/src/github.com/nebula-chat/chatengine/vendor/context
        /usr/lib/go-1.6/src/context (from $GOROOT)
        /var/workspace/src/context (from $GOPATH)

Could you please help build the solution? (The $GOROOT path above points at go-1.6; the standard library's context package only exists since Go 1.7, so a newer Go toolchain is needed here.)

messenger.go build error

I got error when build messenger/biz_server

.\messenger.go:134:35: cannot use s2 (type *"google.golang.org/grpc".Server) as type *"github.com/nebula-chat/chatengine/vendor/google.golang.org/grpc".Server in argument to mtproto.RegisterRPCAccountServer
.\messenger.go:135:32: cannot use s2 (type *"google.golang.org/grpc".Server) as type *"github.com/nebula-chat/chatengine/vendor/google.golang.org/grpc".Server in argument to mtproto.RegisterRPCAuthServer
.\messenger.go:136:32: cannot use s2 (type *"google.golang.org/grpc".Server) as type *"github.com/nebula-chat/chatengine/vendor/google.golang.org/grpc".Server in argument to mtproto.RegisterRPCBotsServer
.\messenger.go:137:36: cannot use s2 (type *"google.golang.org/grpc".Server) as type *"github.com/nebula-chat/chatengine/vendor/google.golang.org/grpc".Server in argument to mtproto.RegisterRPCChannelsServer
.\messenger.go:138:36: cannot use s2 (type *"google.golang.org/grpc".Server) as type *"github.com/nebula-chat/chatengine/vendor/google.golang.org/grpc".Server in argument to mtproto.RegisterRPCContactsServer
.\messenger.go:139:32: cannot use s2 (type *"google.golang.org/grpc".Server) as type *"github.com/nebula-chat/chatengine/vendor/google.golang.org/grpc".Server in argument to mtproto.RegisterRPCHelpServer
.\messenger.go:140:36: cannot use s2 (type *"google.golang.org/grpc".Server) as type *"github.com/nebula-chat/chatengine/vendor/google.golang.org/grpc".Server in argument to mtproto.RegisterRPCLangpackServer
.\messenger.go:141:36: cannot use s2 (type *"google.golang.org/grpc".Server) as type *"github.com/nebula-chat/chatengine/vendor/google.golang.org/grpc".Server in argument to mtproto.RegisterRPCMessagesServer
.\messenger.go:142:36: cannot use s2 (type *"google.golang.org/grpc".Server) as type *"github.com/nebula-chat/chatengine/vendor/google.golang.org/grpc".Server in argument to mtproto.RegisterRPCPaymentsServer
.\messenger.go:143:33: cannot use s2 (type *"google.golang.org/grpc".Server) as type *"github.com/nebula-chat/chatengine/vendor/google.golang.org/grpc".Server in argument to mtproto.RegisterRPCPhoneServer
.\messenger.go:143:33: too many errors

Is one of the parameters to NewZRpcClient() wrong?

func NewZRpcClient(protoName string, conf *ZRpcClientConfig, cb ZPpcClientCallBack) *ZRpcClient {
	clients := map[string][]string{}

	c := &ZRpcClient{
		callback: cb,
	}

	c.clients = net2.NewTcpClientGroupManager(protoName, clients, c)

	// The clients map passed in here is freshly created and empty -- is that a mistake?
	// (Judging by the watcher-based discovery code further down this page, the etcd
	// watcher appears to populate the group later, so the empty map may be intentional.)

./biz_server cannot start

Everything compiled correctly.

All these services started correctly

cd $GOPATH/src/github.com/nebula-chat/chatengine/service/auth_session
./auth_session

cd $GOPATH/src/github.com/nebula-chat/chatengine/service/document
./document

cd $GOPATH/src/github.com/nebula-chat/chatengine/messenger/sync
./sync

cd $GOPATH/src/github.com/nebula-chat/chatengine/messenger/upload
./upload

Problem with:

root@testchat:/var/workspace/src/github.com/nebula-chat/chatengine/messenger/biz_server# ./biz_server

Got

I0625 18:25:08.731736    8181 app_instance.go:60] instance initialize...
I0625 18:25:08.731997    8181 messenger.go:99] messengerServer - initialize...
I0625 18:25:08.732791    8181 messenger.go:106] messengerServer - load conf: &{1 127.0.0.1 0xc000118540 [{immaster root:@tcp(127.0.0.1:3306)/chatengine?charset=utf8mb4 5 2} {imslave root:@tcp(127.0.0.1:3306)/chatengine?charset=utf8mb4 5 2}] [{cache 127.0.0.1:6379 100 100 1000000000 1000000000 1000000000 10000000000 0 }] 0xc000096c40 0xc000096a80 0xc000096b40 0xc000096bc0}
E0625 18:25:13.738853    8181 rpc_client.go:78] context deadline exceeded
panic: context deadline exceeded

goroutine 1 [running]:
github.com/nebula-chat/chatengine/pkg/grpc_util.NewRPCClientByServiceDiscovery(0xc000096c40, 0x12e47e0, 0xc000155d10, 0xc0000b4ea0)
        /var/workspace/src/github.com/nebula-chat/chatengine/pkg/grpc_util/rpc_client.go:79 +0x904
github.com/nebula-chat/chatengine/service/document/client.InstallNbfsClient(0xc000096c40)
        /var/workspace/src/github.com/nebula-chat/chatengine/service/document/client/rpc_document_client.go:37 +0x2f
main.(*messengerServer).Initialize.func1()
        /var/workspace/src/github.com/nebula-chat/chatengine/messenger/biz_server/messenger.go:117 +0xef
github.com/nebula-chat/chatengine/messenger/biz_server/biz/core.InstallCoreModels(0x1, 0x157fce0, 0x1, 0xc000655e00, 0x2)
        /var/workspace/src/github.com/nebula-chat/chatengine/messenger/biz_server/biz/core/core.go:63 +0x248
main.(*messengerServer).Initialize(0xc000aec6e0, 0x15ab915900000000, 0x1)
        /var/workspace/src/github.com/nebula-chat/chatengine/messenger/biz_server/messenger.go:108 +0x1a7
github.com/nebula-chat/chatengine/pkg/util.DoMainAppInstance(0x18077c0, 0xc000aec6e0)
        /var/workspace/src/github.com/nebula-chat/chatengine/pkg/util/app_instance.go:61 +0x175
main.main()
        /var/workspace/src/github.com/nebula-chat/chatengine/messenger/biz_server/messenger.go:163 +0x95
root@indachat:/var/workspace/src/github.com/nebula-chat/chatengine/messenger/biz_server# 

From this issue, it seems I have to launch the load balancer ... https://github.com/liyue201/grpc-lb
I did not find documentation for a dev load-balancer setup.

The startup of access/auth_key and access/session fails in the same way.

frontend started normally

Failed to compile the gateway under Windows

Although it's not related to your code (a gnet issue), I can't compile the gateway under Windows. Any suggestions?
Here is the compile error:
# github.com/panjf2000/gnet
..\..\..\..\..\..\..\..\pkg\mod\github.com\teamgram\[email protected]\connection_windows.go:140:2: cannot use interface {} value as type []byte in return argument: need type assertion
..\..\..\..\..\..\..\..\pkg\mod\github.com\teamgram\[email protected]\connection_windows.go:140:23: cannot use c (type *stdConn) as type Conn in argument to c.codec.Decode: *stdConn does not implement Conn (missing ConnID method)
..\..\..\..\..\..\..\..\pkg\mod\github.com\teamgram\[email protected]\connection_windows.go:146:36: cannot use c (type *stdConn) as type Conn in argument to c.codec.Encode: *stdConn does not implement Conn (missing ConnID method)
..\..\..\..\..\..\..\..\pkg\mod\github.com\teamgram\[email protected]\connection_windows.go:149:31: cannot use c (type *stdConn) as type Conn in argument to c.loop.eventHandler.PreWrite: *stdConn does not implement Conn (missing ConnID method)
..\..\..\..\..\..\..\..\pkg\mod\github.com\teamgram\[email protected]\connection_windows.go:151:33: cannot use c (type *stdConn) as type Conn in argument to c.loop.eventHandler.AfterWrite: *stdConn does not implement Conn (missing ConnID method)
..\..\..\..\..\..\..\..\pkg\mod\github.com\teamgram\[email protected]\connection_windows.go:254:30: cannot use c (type *stdConn) as type Conn in argument to c.loop.eventHandler.PreWrite: *stdConn does not implement Conn (missing ConnID method)
..\..\..\..\..\..\..\..\pkg\mod\github.com\teamgram\[email protected]\connection_windows.go:256:32: cannot use c (type *stdConn) as type Conn in argument to c.loop.eventHandler.AfterWrite: *stdConn does not implement Conn (missing ConnID method)
..\..\..\..\..\..\..\..\pkg\mod\github.com\teamgram\[email protected]\eventloop_windows.go:102:41: cannot use c (type *stdConn) as type Conn in argument to el.eventHandler.OnOpened: *stdConn does not implement Conn (missing ConnID method)
..\..\..\..\..\..\..\..\pkg\mod\github.com\teamgram\[email protected]\eventloop_windows.go:104:27: cannot use c (type *stdConn) as type Conn in argument to el.eventHandler.PreWrite: *stdConn does not implement Conn (missing ConnID method)
..\..\..\..\..\..\..\..\pkg\mod\github.com\teamgram\[email protected]\eventloop_windows.go:106:29: cannot use c (type *stdConn) as type Conn in argument to el.eventHandler.AfterWrite: *stdConn does not implement Conn (missing ConnID method)
..\..\..\..\..\..\..\..\pkg\mod\github.com\teamgram\[email protected]\eventloop_windows.go:106:29: too many errors

Group records cannot be fully deleted

  • Platform: Android
  • Reproduction rate: 100%
  • Steps to reproduce: create a new group, add anyone to it, kick the person you just added, then tap "Delete and leave group", log out of the account, and log in again. The group you deleted and left reappears in the chat list with a member count of 0, and you can still send messages in it. Repeating the steps above reproduces the problem.
  • Deleted contacts have the same problem: they reappear after being deleted.

Unable to authenticate

I installed chatengine today.

client: nekohasekai/NebulaAndroid/actions

Jun 16 18:13:42 telegram-server FRONTEND[2134]: I0616 18:13:42.931123 2134 server.go:223] onClientHandshakeMessage - handshake: peer({connID: 2@handshake-(127.0.0.1:56490->127.0.0.1:10005)}), state: {data2:<state:514 res_state:1 client_conn_id:6 ctx:<constructor:CRC32_handshakeContext data2:<nonce:"\367\302C\017\035tU\273\004\234\335\301@\357\353\252" server_nonce:"\021E\251\273\357\212\221\334\247\014\371\342e\375\322j" > > > }
Jun 16 18:13:42 telegram-server FRONTEND[2134]: I0616 18:13:42.931190 2134 server.go:245] onClientHandshakeMessage - sendToClient to: {peer: {connID: 6@frontend443-(10.72.150.55:12443->10.72.111.204:36220)}, handshake: data2:<state:514 res_state:1 client_conn_id:6 ctx:<constructor:CRC32_handshakeContext data2:<nonce:"\367\302C\017\035tU\273\004\234\335\301@\357\353\252" server_nonce:"\021E\251\273\357\212\221\334\247\014\371\342e\375\322j" > > > }
Jun 16 18:13:42 telegram-server FRONTEND[2134]: I0616 18:13:42.932585 2134 mtproto_app_codec.go:63] size1: 40
Jun 16 18:13:42 telegram-server FRONTEND[2134]: I0616 18:13:42.932616 2134 mtproto_server.go:102] onConnectionDataArrived 10.72.111.204:36220
Jun 16 18:13:42 telegram-server FRONTEND[2134]: I0616 18:13:42.932630 2134 server.go:145] onServerMessageDataArrived - receive data: {peer: {connID: 6@frontend443-(10.72.150.55:12443->10.72.111.204:36220)}, md: {conn_type: 1, auth_key_id: 0, quick_ack_id: 0, payload_len: 40}, msg: %!s(MISSING)}

(screenshot)

Client build

The Android client build keeps failing, which is rather painful.
I want to drive the server through the client and study the server source that way. Could the author share a client source tree that is known to build?

Error after running for a while

BizUnaryRecoveryHandler - goroutine 131366 [running]:
runtime/debug.Stack(0x7f1815167fa8, 0xc001163080, 0xc001b3e2f0)
/usr/lib/golang/src/runtime/debug/stack.go:24 +0xa7
github.com/nebula-chat/chatengine/pkg/grpc_util.BizUnaryRecoveryHandler(0x1701dc0, 0xc001d66420, 0x12498e0, 0xc00154fa50, 0x4c97db, 0xc00142c02d)
/root/src/github.com/nebula-chat/chatengine/pkg/grpc_util/rpc_recovery_handler.go:32 +0x34
github.com/nebula-chat/chatengine/pkg/grpc_util/middleware/recovery2.unaryRecoverFrom(0x1701dc0, 0xc001d66420, 0x12498e0, 0xc00154fa50, 0x151ee40, 0xc000f61780, 0x40d47f)
/root/src/github.com/nebula-chat/chatengine/pkg/grpc_util/middleware/recovery2/interceptors.go:75 +0x5a
github.com/nebula-chat/chatengine/pkg/grpc_util/middleware/recovery2.UnaryServerInterceptor.func1.1(0x1701dc0, 0xc001d66420, 0xc0004bb2a0, 0xc000f61b38)
/root/src/github.com/nebula-chat/chatengine/pkg/grpc_util/middleware/recovery2/interceptors.go:42 +0x77
panic(0x12498e0, 0xc00154fa50)
/usr/lib/golang/src/runtime/panic.go:513 +0x1b9
github.com/nebula-chat/chatengine/messenger/biz_server/biz/base.(*PeerUtil).ToPeer(0xc0013df000, 0xc00116fae0)
/root/src/github.com/nebula-chat/chatengine/messenger/biz_server/biz/base/peer_util.go:194 +0x3ca
github.com/nebula-chat/chatengine/messenger/biz_server/server/messages.(*MessagesServiceImpl).MessagesGetPeerDialogs(0xc000010d80, 0x1701dc0, 0xc001d66420, 0xc00132e540, 0xc000010d80, 0x20, 0xc000f61a98)
/root/src/github.com/nebula-chat/chatengine/messenger/biz_server/server/messages/messages.getPeerDialogs_handler.go:68 +0x9f2
github.com/nebula-chat/chatengine/mtproto._RPCMessages_MessagesGetPeerDialogs_Handler.func1(0x1701dc0, 0xc001d66420, 0x13dc5c0, 0xc00132e540, 0xc0004bb2a0, 0xc000f61b38, 0x40dd08, 0x20)
/root/src/github.com/nebula-chat/chatengine/mtproto/schema.tl.sync_service.pb.go:18163 +0x89
github.com/nebula-chat/chatengine/pkg/grpc_util/middleware/recovery2.UnaryServerInterceptor.func1(0x1701dc0, 0xc001d66420, 0x13dc5c0, 0xc00132e540, 0xc001c3b460, 0xc001c3b480, 0x0, 0x0, 0x0, 0x0)
/root/src/github.com/nebula-chat/chatengine/pkg/grpc_util/middleware/recovery2/interceptors.go:47 +0xba
github.com/nebula-chat/chatengine/mtproto._RPCMessages_MessagesGetPeerDialogs_Handler(0x14b8260, 0xc000010d80, 0x1701dc0, 0xc001d66420, 0xc0010c20e0, 0xc000560db0, 0x0, 0x0, 0x0, 0x0)
/root/src/github.com/nebula-chat/chatengine/mtproto/schema.tl.sync_service.pb.go:18165 +0x158
github.com/nebula-chat/chatengine/vendor/google.golang.org/grpc.(*Server).processUnaryRPC(0xc00041a900, 0x1716a60, 0xc00043de00, 0xc0013e9300, 0xc000529950, 0x25c0b78, 0x0, 0x0, 0x0)
/root/src/github.com/nebula-chat/chatengine/vendor/google.golang.org/grpc/server.go:982 +0x4cd
github.com/nebula-chat/chatengine/vendor/google.golang.org/grpc.(*Server).handleStream(0xc00041a900, 0x1716a60, 0xc00043de00, 0xc0013e9300, 0x0)
/root/src/github.com/nebula-chat/chatengine/vendor/google.golang.org/grpc/server.go:1208 +0x1311
github.com/nebula-chat/chatengine/vendor/google.golang.org/grpc.(*Server).serveStreams.func1.1(0xc0004820f0, 0xc00041a900, 0x1716a60, 0xc00043de00, 0xc0013e9300)
/root/src/github.com/nebula-chat/chatengine/vendor/google.golang.org/grpc/server.go:686 +0x9f
created by github.com/nebula-chat/chatengine/vendor/google.golang.org/grpc.(*Server).serveStreams.func1
/root/src/github.com/nebula-chat/chatengine/vendor/google.golang.org/grpc/server.go:684 +0xa1
E0710 11:36:05.589755 2653 rpc_recovery_handler.go:44] Panic: rpc error: code = Unknown desc = panic unknown triggered: ToPeer(PEER_SELF: {peer_id: 0, access_hash: 0) error!

MySQL CPU load too high (2)

QUERY:
select user_id, user_message_box_id, dialog_id, dialog_message_id, message_data_id, pts, message_box_type, reply_to_msg_id, mentioned, media_unread, date2 from message_boxes where message_data_id = ? and deleted = 0 limit 1

SUGGESTION:
add an index (see the sketch below)
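A hedged sketch of applying that suggestion from Go, reusing the DSN that appears elsewhere on this page; the index name is an invention and the exact column set should be validated with EXPLAIN before deploying:

package main

import (
	"database/sql"
	"log"

	_ "github.com/go-sql-driver/mysql"
)

func main() {
	db, err := sql.Open("mysql", "root:@tcp(127.0.0.1:3306)/chatengine")
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()
	// Indexing (message_data_id, deleted) lets the lookup above stop at
	// one row instead of scanning message_boxes.
	if _, err := db.Exec(
		"ALTER TABLE message_boxes ADD INDEX idx_message_data_id (message_data_id, deleted)"); err != nil {
		log.Fatal(err)
	}
}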

Startup exceptions

  1. Starting sync reports an error; the message says a file cannot be found, and an extra dot is appended to the sync directory.
    (screenshot)

  2. Starting biz_server throws an exception.
    (screenshot)
    Docker itself is running normally.
    (screenshot)

Question: Update API Layer

Is there a guide on how to update the API layer?
I have the schema for layer 121 and can convert it based on the existing implementation, but I don't know how to generate the file descriptor. Is there a guide for that?

Is the test server running the same code as the master branch?

A Telegram client built with the provided patches can log in to the current test server (47.103.102.219) and send and receive one-on-one messages normally. But when I build from the Docker code and point the client at my private server, the client cannot send or receive messages, and after switching to another app and back it shows "Connecting..." indefinitely.
Server-side error log:

E0517 20:18:00.226402      85 rpc_recovery_handler.go:32] BizUnaryRecoveryHandler - goroutine 134 [running]:
runtime/debug.Stack(0x406234, 0xc0001d2a80, 0xc000b05300)
 /usr/local/go/src/runtime/debug/stack.go:24 +0x9d
github.com/nebula-chat/chatengine/pkg/grpc_util.BizUnaryRecoveryHandler(0x18870e0, 0xc00084c870, 0x12e0a20, 0x2769c10, 0x46ec3e, 0xc000d7c0b0)
 /go/src/github.com/nebula-chat/chatengine/pkg/grpc_util/rpc_recovery_handler.go:32 +0x34
github.com/nebula-chat/chatengine/pkg/grpc_util/middleware/recovery2.unaryRecoverFrom(0x18870e0, 0xc00084c870, 0x12e0a20, 0x2769c10, 0x1688358, 0xc000b053b0, 0x0)
 /go/src/github.com/nebula-chat/chatengine/pkg/grpc_util/middleware/recovery2/interceptors.go:75 +0x5a
github.com/nebula-chat/chatengine/pkg/grpc_util/middleware/recovery2.UnaryServerInterceptor.func1.1(0x18870e0, 0xc00084c870, 0xc000c490a0, 0xc000b05ad8)
 /go/src/github.com/nebula-chat/chatengine/pkg/grpc_util/middleware/recovery2/interceptors.go:42 +0x77
panic(0x12e0a20, 0x2769c10)
 /usr/local/go/src/runtime/panic.go:522 +0x1b5
github.com/nebula-chat/chatengine/messenger/biz_server/biz/core/message.(*MessageModel).SendMessage.func2(0xc000000006, 0xc0002955e0)
 /go/src/github.com/nebula-chat/chatengine/messenger/biz_server/biz/core/message/message_helper.go:559 +0x1d2
github.com/nebula-chat/chatengine/messenger/biz_server/biz/core/message.(*MessageModel).SendInternalMessage(0xc000913da0, 0x6, 0xc000ac02c0, 0x912b636b948557a7, 0xc0000d8900, 0xc00084cba0, 0xc000b05780, 0xbc, 0x203000)
 /go/src/github.com/nebula-chat/chatengine/messenger/biz_server/biz/core/message/message_helper.go:362 +0x698
github.com/nebula-chat/chatengine/messenger/biz_server/biz/core/message.(*MessageModel).SendMessage(0xc000913da0, 0xc000000006, 0xc000ac02c0, 0x912b636b948557a7, 0xc00084cba0, 0xc000b5c710, 0xc000cbadc0, 0xc000b5c720, 0x0, 0x0, ...)
 /go/src/github.com/nebula-chat/chatengine/messenger/biz_server/biz/core/message/message_helper.go:554 +0x110
github.com/nebula-chat/chatengine/messenger/biz_server/server/messages.(*MessagesSer

Client KeyFingerPrint doesn't match

On the client, the diffs show the key fingerprint 0xa9e071c1771060cd (12240908862933197005), while on the server the value is 0x1CAC303C5826940B (2066079364791309323).
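For reference when comparing the two values: MTProto computes a public-key fingerprint as the low 64 bits of SHA-1 over the TL-serialized RSA modulus and exponent, so a mismatch usually means the client and server binaries embed different RSA public keys. A small Go sketch of that computation (helper names are mine; verify against the project's own implementation):

package main

import (
	"crypto/sha1"
	"encoding/binary"
	"fmt"
)

// tlBytes encodes a byte string in TL form: a one-byte length (or 0xFE plus
// a 24-bit little-endian length for long strings), the data, then zero
// padding up to a multiple of 4 bytes.
func tlBytes(b []byte) []byte {
	var out []byte
	if len(b) < 254 {
		out = append(out, byte(len(b)))
	} else {
		out = append(out, 0xFE, byte(len(b)), byte(len(b)>>8), byte(len(b)>>16))
	}
	out = append(out, b...)
	for len(out)%4 != 0 {
		out = append(out, 0)
	}
	return out
}

// fingerprint returns the low 64 bits of SHA-1(n:string e:string), i.e. the
// last 8 hash bytes read little-endian.
func fingerprint(n, e []byte) uint64 {
	h := sha1.Sum(append(tlBytes(n), tlBytes(e)...))
	return binary.LittleEndian.Uint64(h[12:20])
}

func main() {
	// Toy inputs; real ones are the big-endian bytes of the RSA modulus and
	// public exponent compiled into each side.
	fmt.Printf("%#x\n", fingerprint([]byte{0x01, 0x02}, []byte{0x01, 0x00, 0x01}))
}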

MySQL CPU load too high

The CPU load is very high; investigation suggests it is caused by the following statement:
"select user_id, pts, pts_count, update_type, update_data from user_pts_updates where user_id = ? and pts > ? order by pts asc"
Adding an index lowers the load (see the sketch below).
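The same shape of fix as the sketch in the MySQL issue above, hedged the same way (invented index name, columns taken from the query's WHERE and ORDER BY):

package main

import (
	"database/sql"
	"log"

	_ "github.com/go-sql-driver/mysql"
)

func main() {
	db, err := sql.Open("mysql", "root:@tcp(127.0.0.1:3306)/chatengine")
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()
	// (user_id, pts) serves both the equality filter and the ORDER BY pts.
	if _, err := db.Exec(
		"ALTER TABLE user_pts_updates ADD INDEX idx_user_id_pts (user_id, pts)"); err != nil {
		log.Fatal(err)
	}
}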

The session service crashes after running for a long time

goroutine 282555 [sync.Cond.Wait, 10 minutes]:
sync.runtime_notifyListWait(0xc000da42d0, 0x7)
/usr/lib/golang/src/runtime/sema.go:510 +0xeb
sync.(*Cond).Wait(0xc000da42c0)
/usr/lib/golang/src/sync/cond.go:56 +0x92
github.com/nebula-chat/chatengine/pkg/queue2.(*SyncQueue).Pop(0xc000f5f860, 0xc000cdf540, 0xc000cdf540)
/root/src/github.com/nebula-chat/chatengine/pkg/queue2/sync_queue.go:51 +0x5b
github.com/nebula-chat/chatengine/access/session/server.(*authSessions).rpcRunLoop(0xc000455cc0)
/root/src/github.com/nebula-chat/chatengine/access/session/server/auth_sessions.go:404 +0x48
created by github.com/nebula-chat/chatengine/access/session/server.(*authSessions).Start
/root/src/github.com/nebula-chat/chatengine/access/session/server/auth_sessions.go:356 +0x6c

goroutine 156673 [select]:
github.com/nebula-chat/chatengine/access/session/server.(*authSessions).runLoop(0xc001da4320)
/root/src/github.com/nebula-chat/chatengine/access/session/server/auth_sessions.go:373 +0x16f
created by github.com/nebula-chat/chatengine/access/session/server.(*authSessions).Start
/root/src/github.com/nebula-chat/chatengine/access/session/server/auth_sessions.go:357 +0x8e

goroutine 284740 [select]:
github.com/nebula-chat/chatengine/access/session/server.(*authSessions).runLoop(0xc000454a00)
/root/src/github.com/nebula-chat/chatengine/access/session/server/auth_sessions.go:373 +0x16f
created by github.com/nebula-chat/chatengine/access/session/server.(*authSessions).Start
/root/src/github.com/nebula-chat/chatengine/access/session/server/auth_sessions.go:357 +0x8e

How does the gRPC server work with a TL-schema client?

The server code uses gRPC and protobuf, but the client is the official Telegram source, which uses TL-schema serialization. I wonder how these two can work together?
Can you help me?
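Not the project's actual code, but a sketch of the pattern the frontend logs on this page imply: the gateway terminates MTProto/TL framing from the client and forwards each payload to the internal gRPC services as an opaque attachment plus protobuf metadata (compare the mtproto_meta:<auth_key_id:... client_addr:...> and attachment_size fields in the logs above). All type names below are made up.

package main

import "fmt"

// tlFrame is a raw TL-serialized request as read from the client socket.
type tlFrame struct {
	authKeyID int64
	payload   []byte // TL bytes; the gateway forwards them without parsing
}

// rpcMetadata stands in for the protobuf metadata attached before the
// internal gRPC hop.
type rpcMetadata struct {
	AuthKeyID  int64
	ClientAddr string
}

// reframe pairs the untouched TL payload with protobuf-side metadata; this
// is how a TL-speaking client and a gRPC/protobuf backend can coexist.
func reframe(f tlFrame, clientAddr string) (rpcMetadata, []byte) {
	return rpcMetadata{AuthKeyID: f.authKeyID, ClientAddr: clientAddr}, f.payload
}

func main() {
	md, attachment := reframe(tlFrame{
		authKeyID: 7899566645959756554,
		payload:   []byte{0x78, 0x97, 0x46, 0x60},
	}, "192.168.1.23:6044")
	fmt.Printf("meta=%+v attachment_size=%d\n", md, len(attachment))
}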

Tracking down the frontend errors behind the failure to connect to the server

I am running on Windows 10 with Go 1.13.8.

Problem description:

While running, the frontend keeps reporting that the key cannot be found. Inspecting the code shows that zrpcclient hits an unexplained error when creating the etcd client,

presumably related to the library in use.

Workaround: retry the creation several times until one succeeds:

// All calls use "brpc"
func NewZRpcClient(protoName string, conf *ZRpcClientConfig, cb ZPpcClientCallBack) *ZRpcClient {
	clients := map[string][]string{}

	c := &ZRpcClient{
		callback: cb,
	}

	// An empty map is passed in here; the watcher is expected to add the
	// clients to the group manager later via AddClient.
	c.clients = net2.NewTcpClientGroupManager(protoName, clients, c)

	// Check name
	for i := 0; i < len(conf.Clients); i++ {
		// service discovery
		etcdConfg := clientv3.Config{
			Endpoints: conf.Clients[i].EtcdAddrs,
		}
		watcher := &Watcher{
			name: conf.Clients[i].Name,
		}
		// watcher.watcher is the *watcher2.ClientWatcher field of the local watcher variable.
		watcher.watcher, _ = watcher2.NewClientWatcher("/nebulaim", conf.Clients[i].Name, etcdConfg, c.clients)
		// This value used to come back nil when used later, hence the retries.
		k := 0
		for k <= 10 && watcher.watcher == nil {
			fmt.Printf("something went wrong connecting to etcd, retrying (attempt %d)\n", k+2)
			time.Sleep(1 * time.Second)
			watcher.watcher, _ = watcher2.NewClientWatcher("/nebulaim", conf.Clients[i].Name, etcdConfg, c.clients)
			k++
		}
		if k >= 10 {
			fmt.Println("etcd error in NewClientWatcher; still failing after 10 retries, try restarting once")
		}

		if conf.Clients[i].Balancer == "ketama" {
			watcher.ketama = load_balancer.NewKetama(10, nil)
		} else {
			watcher.ketama = nil
		}
		c.watchers = append(c.watchers, watcher)
	}

	return c
}

Connect webogram to the nebula-chat / nebulaim server?

The https://github.com/nebulaim/webogram client is easy to set up with Node.js 10 and runs, and the server is also easy to install and run. Could you tweak webogram a little so it can talk to the server? It looks like the server cannot recognize the client. With that working, anyone interested in the project could jump in quickly.

Here is some of the browser log:

[SW] on message {type: "notifications_clear"}
push_worker.js:85 [SW] on message {type: "notifications_clear"}
push_worker.js:85 [SW] on message {type: "notifications_clear"} J @ jquery.min.js:2
mtproto_wrapper.js:143 Get networker error {code: 406, type: "NETWORK_BAD_RESPONSE", url: "http://127.0.0.1:8800/apiw1", originalError: {…}} undefined
mtproto_wrapper.js:143 Get networker error {code: 406, type: "NETWORK_BAD_RESPONSE", url: "http://127.0.0.1:8800/apiw1", originalError: {…}} undefined
controllers.js:217 sendCode error {code: 406, type: "NETWORK_BAD_RESPONSE", url: "http://127.0.0.1:8800/apiw1", originalError: {…}, handled: true, …}
services.js:4044 notify {title: "Telegram", message: "Your authorization key was successfully generated! Open the app to log in.", tag: "auth_key"} true true false
angular.js:11881 POST http://127.0.0.1:8800/apiw1 net::ERR_EMPTY_RESPONSE (anonymous) @ angular.js:11881 sendReq @ angular.js:11642
push_worker.js:85 [SW] on message {type: "notifications_clear"}

Here is the server log:

I0730 14:52:59.923364 35437 server.go:301] onServerUnencryptedRawMessage - receive data: {peer: {connID: 3@frontend80-(127.0.0.1:8800->127.0.0.1:53683)}, ctx: &{{%!s(int32=0) %!s(uint32=0)} %!s(int=256) %!s(*mtproto.TLHandshakeData=&{0xc0006f9950 {} [] 0}) %!s(int64=0)}, msg: {conn_type: 2, auth_key_id: 0, quick_ack_id: 0, payload_len: 40}}
I0730 14:52:59.924172 35437 server.go:340] sendMessage - handshake: {peer: {connID: 3@frontend80-(127.0.0.1:8800->127.0.0.1:53683)}, md: request:<service_name:"handshake" method_name:"mtproto.TLHandshakeData" log_id:1156110426904203264 trace_id:1156110426904203266 span_id:1156110426904203267 > correlation_id:1156110426904203265 attachment_size:40 mtproto_meta:<auth_key_id:0 server_id:1 client_conn_id:3 client_addr:"127.0.0.1:53683" from:"frontend" receive_time:1564473179924 > , msg: data2:<state:513 client_conn_id:3 ctx:<constructor:CRC32_handshakeContext data2:<> > > }
I0730 14:52:59.927902 35401 server.go:95] onServerMessageDataArrived - msg: data2:<state:513 client_conn_id:3 ctx:<constructor:CRC32_handshakeContext data2:<> > >
I0730 14:52:59.931836 35401 decode.go:320] newTLObjectByClassID, classID: 0x60469778
I0730 14:52:59.932478 35401 handshake.go:193] req_pq#60469778 - state: {data2:<state:513 client_conn_id:3 ctx:<constructor:CRC32_handshakeContext data2:<> > > }, request: {"nonce":"12ODQvHpDXvLMzzzbxRA8w=="}
I0730 14:52:59.934285 35401 cache_state_manager.go:71] put state key: (salts_d7638342f1e90d7bcb333cf36f1440f3@a8fc3d2f1e509ae57c615f9258d16f3b)
I0730 14:52:59.935936 35401 handshake.go:217] req_pq#60469778 - state: {data2:<state:513 client_conn_id:3 ctx:<constructor:CRC32_handshakeContext data2:<nonce:"\327c\203B\361\351\r{\3133<\363o\024@\363" server_nonce:"\250\374=/\036P\232\345|a_\222X\321o;" > > > }, reply: {"data2":{"nonce":"12ODQvHpDXvLMzzzbxRA8w==","server_nonce":"qPw9Lx5QmuV8YV+SWNFvOw==","pq":"\u0017\ufffdH\ufffd\u001a\u0008\ufffd\ufffd","server_public_key_fingerprints":[-6205835210776354611]}}
I0730 14:52:59.937031 35437 server.go:223] onClientHandshakeMessage - handshake: peer({connID: 1@handshake-(127.0.0.1:53672->127.0.0.1:10005)}), state: {data2:<state:514 res_state:1 client_conn_id:3 ctx:<constructor:CRC32_handshakeContext data2:<nonce:"\327c\203B\361\351\r{\3133<\363o\024@\363" server_nonce:"\250\374=/\036P\232\345|a_\222X\321o;" > > > }
W0730 14:52:59.938390 35437 server.go:230] conn closed, handshake: data2:<state:514 res_state:1 client_conn_id:3 ctx:<constructor:CRC32_handshakeContext data2:<nonce:"\327c\203B\361\351\r{\3133<\363o\024@\363" server_nonce:"\250\374=/\036P\232\345|a_\222X\321o;" > > >
I0730 14:52:59.965383 35437 zrpc_client.go:229] sendPing: ping_id:6785692651167134319
I0730 14:52:59.965448 35437 zrpc_client.go:236] &{request:<service_name:"zrpc" method_name:"mtproto.TLPing" > [8 239 228 174 222 185 240 231 149 94] []}
I0730 14:52:59.965518 35437 server.go:218] onClientTimer
E0730 14:59:30.879001 35437 mtproto_http_proxy_codec.go:51] read tcp 127.0.0.1:8800->127.0.0.1:53728: i/o timeout
I0730 14:59:30.879075 35437 mtproto_server.go:119] onConnectionClosed - 127.0.0.1:53728
I0730 14:59:30.879094 35437 server.go:176] onServerConnectionClosed - {peer: {connID: 4@frontend80-(127.0.0.1:8800->127.0.0.1:53728)}}
E0730 14:59:30.879206 35437 tcp_server.go:192] conn {{connID: 4@frontend80-(127.0.0.1:8800->127.0.0.1:53728)}} recv error: read tcp 127.0.0.1:8800->127.0.0.1:53728: i/o timeout
I0730 14:59:58.453999 35377 zrpc_client.go:229] sendPing: ping_id:1354515734731151583
I0730 14:59:58.454092 35377 zrpc_client.go:236] &{request:<service_name:"zrpc" method_name:"mtproto.TLPing" > [8 223 177 207 218 165 156 141 230 18] []}
I0730 14:59:58.455277 35377 zrpc_client.go:188] recv pong: constructor:CRC32_pong data2:<ping_id:1354515734731151583 >
I0730 15:00:00.033277 35437 zrpc_client.go:229] sendPing: ping_id:1679611750482287486
I0730 15:00:00.033372 35437 zrpc_client.go:236] &{request:<service_name:"zrpc" method_name:"mtproto.TLPing" > [8 254 182 198 160 207 191 203 167 23] []}
I07
