raulmur / orb_slam
A Versatile and Accurate Monocular SLAM
Home Page: http://webdiis.unizar.es/~raulmur/orbslam/
License: Other
Did you do any subpixel refinement on the tracked features?
FAST does not give subpixel detections, and descriptor matching does not do subpixel refinement either (unless you did it somewhere in the code and I didn't find it).
Will this cause the odometry to drift more compared to template-based tracking methods such as KLT or ESM?
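The question is right that FAST detection and descriptor matching are pixel-accurate. For readers wondering what subpixel refinement would look like, here is a minimal, hypothetical sketch of 1-D parabolic peak interpolation over the corner response; this is not code from ORB-SLAM, and the function name is illustrative:

```python
# Hypothetical sketch: fit a parabola through the corner response at a pixel
# and its two neighbours, and take the parabola's peak as the subpixel offset.

def parabolic_subpixel_offset(s_left, s_center, s_right):
    """Return the subpixel offset in [-0.5, 0.5] of the response peak,
    given the scores at x = -1, 0, +1 around an integer maximum."""
    denom = s_left - 2.0 * s_center + s_right
    if denom == 0.0:  # flat response: no refinement possible
        return 0.0
    return 0.5 * (s_left - s_right) / denom

# Example: responses sampled from a parabola peaking at x = 0.3
f = lambda x: -(x - 0.3) ** 2
offset = parabolic_subpixel_offset(f(-1.0), f(0.0), f(1.0))
```

The same one-dimensional fit can be applied along x and y independently to refine a 2-D corner location.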
I am new to ROS. I followed the usage instructions step by step, but I really don't know how to start the camera node with my own image sequences in the 4th step. I downloaded the image sequences into a folder.
What should I do? Thanks.
Environment: ROS indigo, Ubuntu Trusty 14.04 on VM ware workstation, CPU i7 2.4GHz, 8G RAM
Dataset: KITTI\data_odometry_gray\dataset\sequences\00\image_0
Bag file: created with rosrun BagFromImages BagFromImages IMAGE_PATH 10.0 image_0.bag
settings.yaml:
%YAML:1.0
Camera.fx: 718.856
Camera.fy: 718.856
Camera.cx: 607.1928
Camera.cy: 185.2157
Camera.k1: 0
Camera.k2: 0
Camera.p1: 0.0
Camera.p2: 0.0
Camera.fps: 10.0
Camera.RGB: 1
ORBextractor.nFeatures: 2000
ORBextractor.scaleFactor: 1.2
ORBextractor.nLevels: 8
ORBextractor.fastTh: 20
ORBextractor.nScoreType: 1
UseMotionModel: 1
During execution, tracking always failed and the relocalization module was invoked, regardless of the velocity motion model and nScoreType settings.
Do you have any idea why the result shown on the ORB-SLAM project website cannot be reproduced?
Hi Raul,
thanks for your work.
Can you tell me which camera you used, and whether there are some camera specs that improve performance more than others?
Thanks
Mauro
Hi all, I'm trying to install ORB_SLAM, but when I do rosrun ORB_SLAM ORB_SLAM...... I get this error: Erreur de segmentation (core dumped) [i.e., segmentation fault].
Any help please, and thank you.
I was running ORB-SLAM on my own dataset and I got this error.
terminate called after throwing an instance of 'std::bad_alloc'
what(): std::bad_alloc
[ORB_SLAM-3] process has died [pid 7918, exit code -6, cmd /home/shabhu/Work/rosfuerte/ORB_SLAM/bin/ORB_SLAM Data/ORBvoc.yml Data/Settings.yaml __name:=ORB_SLAM __log:=/home/shabhu/.ros/log/28136f02-e8d9-11e4-bce3-18a905b9cd56/ORB_SLAM-3.log].
log file: /home/shabhu/.ros/log/28136f02-e8d9-11e4-bce3-18a905b9cd56/ORB_SLAM-3*.log
Can you help me out?
Hi raul,
Thanks for your contribution. I was wondering why the tf frames behave strangely in rviz. I have found that ORB-SLAM/Camera is fixed and the world is moving; why? Sometimes I also notice that ORB-SLAM/Camera is rotating! How can I convert it to real-world coordinates? Does the tf transform give us the camera pose information? Thanks in advance.
Hello, in the video below:
https://www.youtube.com/watch?t=30&v=HlBmq70LKrQ
There are two versions of the map. The first contains dots in red and black:
http://imgur.com/ra566zO
The other version is colorful:
http://imgur.com/xn2RG5o
I've tried to find a topic publishing the colorful version, but I haven't found one. (I've seen /ORB_SLAM/Map_array, but it's an empty topic.) I've also checked the ROS parameters and found nothing related. So I can only get the first version in my rviz.
http://imgur.com/UDXhtgO
Should I modify the code or do something else to see the colorful version?
Thanks in advance.
1. What is the size of the visual dictionary you are using? I mean, how many nodes are there? I haven't seen such good relocalization in any other SLAM algorithm.
2. What does the motion model in the Settings.yaml file do? Are you using an EKF with a constant-velocity model?
thanks
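On question 2: ORB-SLAM does not run an EKF; the paper describes a simple constant-velocity motion model that predicts the next camera pose from the last relative motion, and the prediction is then refined by feature matching. A minimal sketch of that kind of prediction under that assumption, using plain 4x4 world-to-camera matrices (names are illustrative, not ORB-SLAM source):

```python
# Constant-velocity motion model sketch (illustrative, not ORB-SLAM source).
# Poses are 4x4 world-to-camera transforms Tcw = [R | t; 0 0 0 1].

def matmul4(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def rigid_inverse(t):
    """Invert a rigid transform: inv([R|t]) = [R^T | -R^T t]."""
    r_t = [[t[j][i] for j in range(3)] for i in range(3)]
    tr = [-sum(r_t[i][k] * t[k][3] for k in range(3)) for i in range(3)]
    return [r_t[0] + [tr[0]], r_t[1] + [tr[1]], r_t[2] + [tr[2]],
            [0.0, 0.0, 0.0, 1.0]]

def predict_next_pose(tcw_prev, tcw_curr):
    """Velocity = Tcw_curr * Tcw_prev^-1; prediction = Velocity * Tcw_curr."""
    velocity = matmul4(tcw_curr, rigid_inverse(tcw_prev))
    return matmul4(velocity, tcw_curr)

identity = [[1.0 if i == j else 0.0 for j in range(4)] for i in range(4)]
step = [row[:] for row in identity]
step[0][3] = 1.0  # camera moved 1 unit along x between the two frames
pred = predict_next_pose(identity, step)  # predicts another unit of motion
```

If the constant-velocity assumption fails (e.g. abrupt motion), tracking falls back to a wider search around the last frame.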
Hi
I have a question about how to integrate multiple cameras into ORB_SLAM in such a way that each can use the map generated by the other one (rather than generating a new map) for localization.
Thanks
Hello Raúl,
did you ever run into the following error? http://answers.ros.org/question/192272/possible-limitation-of-points-in-markerline_strip-and-rviz/
By now I have had this problem several times; mostly it happened after a loop closure when my rosbag ran the second time. I know it's an rviz problem, but maybe you have solved this already?
Hi,
@raulmur
When I use your code to compute the relative pose of the first two frames, I always get two very good solutions; both "bestGood" and "secondBestGood" can reach the number of matches.
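For context on the two questions above: during initialization ORB-SLAM decomposes the chosen model into several motion hypotheses and only accepts the best one if it clearly beats the runner-up; when "bestGood" and "secondBestGood" are both high, the reconstruction is ambiguous (often a planar or low-parallax scene) and initialization is rejected. A minimal sketch of that kind of winner-takes-all check; the function name and the 0.75 ratio are illustrative, not copied from the source:

```python
def select_hypothesis(goods, ratio=0.75):
    """Return the index of the winning motion hypothesis, or None if the
    best inlier count does not clearly beat the second best (ambiguous)."""
    ranked = sorted(range(len(goods)), key=lambda i: goods[i], reverse=True)
    best, second = ranked[0], ranked[1]
    if goods[second] < ratio * goods[best]:
        return best
    return None  # ambiguous: two hypotheses explain the matches equally well

# A clear winner is accepted; two near-equal hypotheses are rejected,
# which is why initialization stalls in the situation described above.
```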
Hi guys,
ORB-SLAM itself seems to be working fine (features tracking well), but I can't visualize the 3D map in rviz at all. I see very weird shapes/polygons instead. Here's a screenshot. Any help would be appreciated.
Thanks,
Sid
Hi, I have recently been studying your code,
and I am interested in how the 3D points are optimized after triangulation.
Thanks
I would like to know the reason for picking ORB as the feature descriptor.
There are other rotational and scale invariant binary descriptors.
If I switch ORB to FREAK (or BRISK), will the system work equally well?
The CMake file provided with the code only works for compiling under ROS Fuerte. Can you provide the files for the Hydro version?
Hi, @raulmur. I am about to start learning your algorithm and code.
Could you please give me some suggestions about development tools? What kinds of tools do you use to develop ORB-SLAM (e.g. vim, emacs, or other IDEs)?
thanks a lot!
Hello
I have remapped the camera topics correctly, but I am getting this error:
ORB-SLAM Copyright (C) 2014 Raul Mur-Artal
This program comes with ABSOLUTELY NO WARRANTY;
This is free software, and you are welcome to redistribute it
under certain conditions. See LICENSE.txt.
Loading ORB Vocabulary. This could take a while.
Wrong path to vocabulary. Path must be absolut or relative to ORB_SLAM package directory.
[orb_slam-3] process has died [pid 30997, exit code 1, cmd /home/farhan/catkin_ws/devel/lib/orb_slam/orb_slam Data/ORBvoc.yml Data/Settings.yaml /camera/image_raw:=/creative_cam/image_color __name:=orb_slam __log:=/home/farhan/.ros/log/2e26604a-16ae-11e5-9789-d8fc9361bda4/orb_slam-3.log].
log file: /home/farhan/.ros/log/2e26604a-16ae-11e5-9789-d8fc9361bda4/orb_slam-3*.log
Thanks
Farhan
I am testing ORB-SLAM (and LSD-SLAM) on the TUM RGB-D Benchmark dataset, and I have some questions about your test process:
ORB questions:
LSD-SLAM questions:
Thank you
When I launch ORB-SLAM from the terminal with "rosrun ORB_SLAM ORB_SLAM PATH_TO_VOCABULARY PATH_TO_SETTINGS_FILE", there is an error: [ERROR] [1438841720.919284826]: Wrong path to settings. Path must be absolut or relative to ORB_SLAM package directory.
PS: I have provided the path of the ORB_SLAM directory in the PATH_TO_SETTINGS_FILE environment variable.
hi, @raulmur
Thanks for sharing your ORB-SLAM code. Sorry to trouble you, but I have a problem while tracking.
I first go straight towards the north and then turn to the east.
ORB-SLAM calculates the heading very well, but the scale of the distance changes noticeably after turning east.
The walking distances towards the north and towards the east are nearly the same, but in rviz the distance to the east is clearly shorter than that to the north.
Could you give me some suggestions?
thanks and best
Hello, @raulmur
I am sorry to trouble you, but I am facing some problems while porting your code to an ARM board.
The first problem is that I have no idea how to get the camera pose. I have looked into your code and found the following lines:
cv::Mat Rwc = mCurrentFrame.mTcw.rowRange(0,3).colRange(0,3).t();
cv::Mat twc = -Rwc*mCurrentFrame.mTcw.rowRange(0,3).col(3);
Is Rwc the rotation matrix of the camera in the world coordinate system? I am also wondering how to access the (x,y,z) coordinates of the camera in the world coordinate system: is it twc, or mCurrentFrame.mTcw.rowRange(0,3).col(3)?
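For anyone with the same question: mTcw maps world coordinates to camera coordinates, so Rwc is indeed the camera's rotation in the world frame, and twc (not the raw translation column of mTcw, which is tcw) is the camera centre in world coordinates. A plain-Python sketch of the same arithmetic (no OpenCV; illustrative only):

```python
# Extract the camera pose from a world-to-camera transform Tcw (sketch).
# Tcw = [Rcw | tcw], so Rwc = Rcw^T and the camera centre twc = -Rcw^T * tcw,
# mirroring the two cv::Mat lines quoted above.

def camera_pose_from_tcw(tcw):
    rcw = [row[:3] for row in tcw[:3]]
    t = [tcw[i][3] for i in range(3)]
    rwc = [[rcw[j][i] for j in range(3)] for i in range(3)]  # transpose
    twc = [-sum(rwc[i][k] * t[k] for k in range(3)) for i in range(3)]
    return rwc, twc

# Identity rotation with tcw = (1, 2, 3) puts the camera at (-1, -2, -3):
tcw = [[1.0, 0.0, 0.0, 1.0],
       [0.0, 1.0, 0.0, 2.0],
       [0.0, 0.0, 1.0, 3.0],
       [0.0, 0.0, 0.0, 1.0]]
rwc, twc = camera_pose_from_tcw(tcw)
```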
The second problem is how to remove the viewer and rviz from the program, because I only need the camera pose. Also, how do I find out all the topics ORB-SLAM publishes?
I am looking forward to hearing from you
best wishes to you
yours, sincerely
Li Qile
Installation: Ubuntu 14.04
ROS: Indigo
Output of echo $ROS_PACKAGE_PATH: /home/josh/Desktop/ORB-SLAM:/opt/ros/indigo/share:/opt/ros/indigo/stacks
Error occurs in building ORB-SLAM at step 3-6. Output of error:
josh@josh-Surface-with-Windows-8-Pro:~/Desktop/ORB_SLAM/build$ cmake .. -DROS_BUILD_TYPE=RELEASE
[rosbuild] Building package ORB_SLAM
[rosbuild] Error from directory check: /opt/ros/indigo/share/ros/core/rosbuild/bin/check_same_directories.py /home/josh/Desktop/ORB_SLAM
1
Traceback (most recent call last):
File "/opt/ros/indigo/share/ros/core/rosbuild/bin/check_same_directories.py", line 46, in
raise Exception
Exception
CMake Error at /opt/ros/indigo/share/ros/core/rosbuild/private.cmake:102 (message):
[rosbuild] rospack found package "ORB_SLAM" at "", but the current
directory is "/home/josh/Desktop/ORB_SLAM". You should double-check your
ROS_PACKAGE_PATH to ensure that packages are found in the correct
precedence order.
Call Stack (most recent call first):
/opt/ros/indigo/share/ros/core/rosbuild/public.cmake:177 (_rosbuild_check_package_location)
CMakeLists.txt:4 (rosbuild_init)
-- Configuring incomplete, errors occurred!
See also "/home/josh/Desktop/ORB_SLAM/build/CMakeFiles/CMakeOutput.log".
Further research on answers.ros.org suggests that invoking CMake directly like this isn't supported, given the error above (source: http://answers.ros.org/question/65801/ros-inside-part-of-a-c-project/).
Does anyone have any idea how to make this work? This would be an impressive project to get up and running.
Sincerely,
Josh Conway
In Thirdparty/DBoW2/DBoW2/TemplatedVocabulary.h
I needed to add an include (presumably #include <limits>) to fix:
error: ‘numeric_limits’ is not a member of ‘std’
Hi,
I was wondering if you could point me to the code you used to train the vocab tree. Is it something you put together or is it part of Dbow2?
Thanks
Nikhil
I'm using a very wide FOV camera (a GoPro), so I'd like to use the FOV camera model (Devernay and Faugeras) instead of the model OpenCV provides. I believe I've modified the code correctly: I've double-checked my math by hand (by observing and verifying the final undistorted coordinates I feed back into the ORB-SLAM code), and features track without problems near the center of the frame. However, features disappear at the edges of the frame (roughly the outer 20% of each edge), so I wanted to check here whether the authors implemented some check that deletes features whose calibrated coordinates fall too far outside the image bounds, e.g. a feature at (10,10) in an image of size (1920,1080) that was corrected to (-150,-50).
In case I haven't analyzed the ORB-SLAM code well, I should probably also mention what changes I've made to switch to the FOV model. In Data/Settings.yaml, I set fx, fy, cx, cy according to my calibration, and set k1, k2, p1, p2 to 0. I modified the call to cv::undistortPoints made in Frame.cc:UndistortKeyPoints and removed the last parameter. This allowed cv::undistortPoints to return normalized coordinates, which I then fed into my own function to account for radial distortion. I set the contents of mat to the values my function returned.
Thanks,
Sid
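For reference, the Devernay-Faugeras FOV model mentioned above has a closed-form radial mapping and inverse, which makes a port easy to sanity-check with a round trip. A minimal sketch, assuming a single field-of-view parameter w (function names are illustrative):

```python
import math

# Devernay-Faugeras FOV camera model (sketch): radial mapping between the
# undistorted radius r_u and the distorted radius r_d, parameterised by the
# field-of-view angle w (in radians).

def distort_radius(r_u, w):
    return math.atan(2.0 * r_u * math.tan(w / 2.0)) / w

def undistort_radius(r_d, w):
    return math.tan(r_d * w) / (2.0 * math.tan(w / 2.0))

# Round-trip check: undistorting a distorted radius recovers the input,
# which is a quick way to verify a hand-rolled implementation like the
# one described in the post above.
r_u, w = 0.5, 1.0
recovered = undistort_radius(distort_radius(r_u, w), w)
```

Points near the image corners have large r_u, so a sign or parameter error in this mapping shows up there first, which matches the symptom described above.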
Hi, @raulmur, how do I import the g2o source code into the ORB-SLAM project in Eclipse with CDT?
Hi all,
I got a black window when I ran:
rosrun image_view image_view image:=/ORB_SLAM/Frame _autosize:=true. Any suggestions, please?
PS: I'm using a Logitech camera. My frames.pdf contains "no tf data received".
Thank you a lot.
Hello,
I am trying to build ORB_SLAM package. I am using ROS-Indigo on ubuntu 14.04. I get the following error on step 6 (https://github.com/raulmur/ORB_SLAM).
~/ORB_SLAM/build$ cmake .. -DROS_BUILD_TYPE=Release
-- Found PythonInterp: /usr/bin/python (found version "2.7.6")
[rosbuild] Building package ORB_SLAM
Failed to invoke /opt/ros/indigo/bin/rospack deps-manifests ORB_SLAM
[rospack] Error: package 'ORB_SLAM' depends on non-existent package 'opencv2' and rosdep claims that it is not a system dependency. Check the ROS_PACKAGE_PATH or try calling 'rosdep update'
CMake Error at /opt/ros/indigo/share/ros/core/rosbuild/public.cmake:129 (message):
Failed to invoke rospack to get compile flags for package 'ORB_SLAM'. Look above for errors from rospack itself. Aborting. Please fix the broken dependency!
Call Stack (most recent call first): /opt/ros/indigo/share/ros/core/rosbuild/public.cmake:207 (rosbuild_invoke_rospack)
CMakeLists.txt:4 (rosbuild_init)
-- Configuring incomplete, errors occurred!
See also "/home/steve/ORB_SLAM/build/CMakeFiles/CMakeOutput.log".
Can somebody please help me out?
I am trying to replicate and use your work as displayed in the paper and in the YouTube video below.
https://www.youtube.com/watch?v=HlBmq70LKrQ
Have you released this module? If so, how do I call it appropriately? If not, what is your timeframe in which you will release it in this github?
Sincerely,
Josh Conway
When I install ORB_SLAM, I have a problem.
/******************************************************************************
Failed to invoke /opt/ros/jade/bin/rospack deps-manifests ORB_SLAM-master
[rospack] Error: package 'ORB_SLAM-master' depends on non-existent package 'opencv2' and rosdep claims that it is not a system dependency. Check the ROS_PACKAGE_PATH or try calling 'rosdep update'
******************************************************************************/
How can I solve this problem? Thanks.
Dear Raul Mur
Hello, this is Li Qile
I have been running your code with my own data recently, but ORB-SLAM cannot initialize; I can only see green lines in /ORB_SLAM/Frame.
May I ask you for some suggestions to fix this situation?
Thanks a lot
Best
Li Qile
Hi, @raulmur
I am sorry to trouble you, but I am facing a very hard problem.
I was using your method to compute the relative pose of the first two frames,
but I always get two solutions, and the values of "bestGood" and "secondBestGood" are nearly the same.
I am looking forward to hearing from you and best wishes to you.
thanks in advance
sincerely
Li Qile
While I have managed to get your example file to work, I struggle with providing an ORB vocabulary from my own dataset. I have a set of image files converted to a .bag file.
How do you extract ORB features from an image dataset to provide to ORB-SLAM?
When ORB_SLAM is running, what should I do if I want to go back to the initialization stage ?
Is there a key for restarting the program ?
Hi,
can you show me where I can find Frame, Map, Camera and World? I can't find them under ORB_SLAM. (In general I think I didn't understand these two parts:
2. The last processed frame is published to the topic /ORB_SLAM/Frame. You can visualize it using image_view, and number 3.)
Thank you a lot.
I have installed ORB_SLAM on the odroid u3.
However, the ORBvoc file is too big to run on the odroid board.
I would appreciate it if you could offer another smaller ORB vocabulary file just for indoor environment.
This is pretty awesome, thanks for sharing!
Is there currently a way to save and load maps?
Hello
Thanks for your sharing of ORB_SLAM
Recently I have been trying to run your code with an RGB-D dataset
from TUM Vision,
but I don't know how to get the OpenCV camera calibration.
Could you please tell me how to obtain it?
thanks in advance.
best
liqile
Hi,
I got that error message during the ORB_SLAM make. I've done everything like in the manual.
Regards,
husdo
/opt/ros/indigo/share/ORB_SLAM/Thirdparty/DBoW2/DBoW2/TemplatedVocabulary.h: In instantiation of ‘DBoW2::TemplatedVocabulary<TDescriptor, F>::~TemplatedVocabulary() [with TDescriptor = cv::Mat; F = DBoW2::FORB]’:
/opt/ros/indigo/share/ORB_SLAM/src/main.cc:86:29: required from here
/opt/ros/indigo/share/ORB_SLAM/Thirdparty/DBoW2/DBoW2/TemplatedVocabulary.h:510:3: warning: deleting object of abstract class type ‘DBoW2::GeneralScoring’ which has non-virtual destructor will cause undefined behaviour [-Wdelete-non-virtual-dtor]
delete m_scoring_object;
^
/opt/ros/indigo/share/ORB_SLAM/Thirdparty/DBoW2/DBoW2/TemplatedVocabulary.h: In instantiation of ‘void DBoW2::TemplatedVocabulary<TDescriptor, F>::createScoringObject() [with TDescriptor = cv::Mat; F = DBoW2::FORB]’:
/opt/ros/indigo/share/ORB_SLAM/Thirdparty/DBoW2/DBoW2/TemplatedVocabulary.h:420:23: required from ‘DBoW2::TemplatedVocabulary<TDescriptor, F>::TemplatedVocabulary(int, int, DBoW2::WeightingType, DBoW2::ScoringType) [with TDescriptor = cv::Mat; F = DBoW2::FORB]’
/opt/ros/indigo/share/ORB_SLAM/src/main.cc:86:29: required from here
/opt/ros/indigo/share/ORB_SLAM/Thirdparty/DBoW2/DBoW2/TemplatedVocabulary.h:446:3: warning: deleting object of abstract class type ‘DBoW2::GeneralScoring’ which has non-virtual destructor will cause undefined behaviour [-Wdelete-non-virtual-dtor]
delete m_scoring_object;
^
[ 11%] Building CXX object CMakeFiles/ORB_SLAM.dir/src/Tracking.cc.o
[ 16%] Building CXX object CMakeFiles/ORB_SLAM.dir/src/LocalMapping.cc.o
[ 22%] Building CXX object CMakeFiles/ORB_SLAM.dir/src/LoopClosing.cc.o
[ 27%] Building CXX object CMakeFiles/ORB_SLAM.dir/src/ORBextractor.cc.o
/opt/ros/indigo/share/ORB_SLAM/src/ORBextractor.cc: In member function ‘void ORB_SLAM::ORBextractor::ComputeKeyPoints(std::vector<std::vector<cv::KeyPoint> >&)’:
/opt/ros/indigo/share/ORB_SLAM/src/ORBextractor.cc:606:63: error: ‘FAST’ was not declared in this scope
FAST(cellImage,cellKeyPoints[i][j],fastTh,true);
^
/opt/ros/indigo/share/ORB_SLAM/src/ORBextractor.cc:615:34: error: ‘ORB’ has not been declared
if( scoreType == ORB::HARRIS_SCORE )
^
/opt/ros/indigo/share/ORB_SLAM/src/ORBextractor.cc:682:17: error: ‘KeyPointsFilter’ has not been declared
KeyPointsFilter::retainBest(keysCell,nToRetain[i][j]);
^
/opt/ros/indigo/share/ORB_SLAM/src/ORBextractor.cc:698:13: error: ‘KeyPointsFilter’ has not been declared
KeyPointsFilter::retainBest(keypoints,nDesiredFeatures);
^
/opt/ros/indigo/share/ORB_SLAM/src/ORBextractor.cc: In member function ‘void ORB_SLAM::ORBextractor::operator()(cv::InputArray, cv::InputArray, std::vector<cv::KeyPoint>&, cv::OutputArray)’:
/opt/ros/indigo/share/ORB_SLAM/src/ORBextractor.cc:759:82: error: ‘GaussianBlur’ was not declared in this scope
GaussianBlur(workingMat, workingMat, Size(7, 7), 2, 2, BORDER_REFLECT_101);
^
/opt/ros/indigo/share/ORB_SLAM/src/ORBextractor.cc: In member function ‘void ORB_SLAM::ORBextractor::ComputePyramid(cv::Mat, cv::Mat)’:
/opt/ros/indigo/share/ORB_SLAM/src/ORBextractor.cc:799:78: error: ‘INTER_LINEAR’ was not declared in this scope
resize(mvImagePyramid[level-1], mvImagePyramid[level], sz, 0, 0, INTER_LINEAR);
^
/opt/ros/indigo/share/ORB_SLAM/src/ORBextractor.cc:799:90: error: ‘resize’ was not declared in this scope
resize(mvImagePyramid[level-1], mvImagePyramid[level], sz, 0, 0, INTER_LINEAR);
^
/opt/ros/indigo/share/ORB_SLAM/src/ORBextractor.cc:802:80: error: ‘INTER_NEAREST’ was not declared in this scope
resize(mvMaskPyramid[level-1], mvMaskPyramid[level], sz, 0, 0, INTER_NEAREST);
^
make[2]: *** [CMakeFiles/ORB_SLAM.dir/src/ORBextractor.cc.o] Error 1
make[1]: *** [CMakeFiles/ORB_SLAM.dir/all] Error 2
make: *** [all] Error 2
Hi, I am running your code with a dataset from TUM Vision,
but an error occurred and the process died.
[ INFO] [1429678011.002510623]: New Map created with 425 points /home/hongbohuang/catkin_ws/src/ORB_SLAM/bin/ORB_SLAM: symbol lookup error: /home/hongbohuang/catkin_ws/src/ORB_SLAM/bin/ORB_SLAM: undefined symbol: _ZN3g2o17EdgeSE3ProjectXYZC1Ev
[ORB_SLAM-3] process has died [pid 13939, exit code 127, cmd /home/hongbohuang/catkin_ws/src/ORB_SLAM/bin/ORB_SLAM Data/ORBvoc.yml Data/Settings.yaml __name:=ORB_SLAM __log:=/home/hongbohuang/.ros/log/6cfb0986-e893-11e4-be96-bcaec584f5df/ORB_SLAM-3.log]. log file: /home/hongbohuang/.ros/log/6cfb0986-e893-11e4-be96-bcaec584f5df/ORB_SLAM-3*.log
Could you please tell me what's wrong with my setup? Thanks in advance.
best
Hi,
When using ORB-SLAM in our application (onboard an MAV), everything works fine until about 5 minutes into the flight. Usually around that time, the ORB-SLAM process dies. Is there a way to see/check why it dies? Is it a memory issue? It just gives us exit code -9.
I have been trying to find any kind of reference to answer this question, but couldn't find anything neither in the paper nor here on github. Is the image set available somewhere?
-Marc
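Regarding the exit code -9 above: that is SIGKILL, which on Linux is often the kernel's OOM killer (check dmesg for out-of-memory messages). One lightweight way to confirm is to log the process's resident memory from /proc/<pid>/status during flight. A small sketch; the VmRSS field is standard Linux procfs, while the helper function is illustrative:

```python
# Parse the resident set size (VmRSS, in kB) from a /proc/<pid>/status dump.
def vmrss_kb(status_text):
    for line in status_text.splitlines():
        if line.startswith("VmRSS:"):
            return int(line.split()[1])  # value is reported in kB
    return None  # field absent (e.g. process swapped out or not running)

# In practice you would poll it periodically, e.g.:
#   with open("/proc/%d/status" % pid) as f:
#       log(vmrss_kb(f.read()))

sample = "Name:\tORB_SLAM\nVmRSS:\t  123456 kB\nThreads:\t12\n"
rss = vmrss_kb(sample)
```

If the logged RSS climbs steadily until the kill, the map growth (keyframes and map points accumulate over time) is the likely culprit.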
Hi,
Can you please tell me which camera (and other equipment, PC, etc.) you used? Thank you.
Best regards.
Thank you for sharing your code. I want to run the code on a phone.
Can you give me some ideas?
Dear all,
I have installed ORB_SLAM and run the example successfully. Now I am moving on to a test on KITTI VO seq 00, as the authors did in their paper.
But when I start ORB_SLAM on KITTI VO seq 00, I just get "Not initialized -> Trying to initialize" and see street-view images with green edges on them in /ORB_SLAM/Frame; please see the link:
https://www.dropbox.com/s/0oi3iu3nqzwpv1h/47.png?dl=0
Could someone give me some hints?
Thanks in advance~
Milton
Hello,
I am currently using Ubuntu 14.04 + ROS Indigo. I have cloned the ORB_SLAM repository to my catkin workspace and all of the necessary dependencies are installed, but when I launch "ExampleGroovyHydro.launch" then I get the following error:
ERROR: cannot launch node of type [ORB_SLAM/ORB_SLAM]: can't locate node [ORB_SLAM] in package [ORB_SLAM]
And indeed I cannot find the ORB_SLAM node within the package. Also, when I run the following instruction: rosrun ORB_SLAM ORB_SLAM PATH_TO_VOCABULARY PATH_TO_SETTINGS_FILE, I get:
[rosrun] Couldn't find executable named ORB_SLAM below /home/nils/catkin_ws/src/ORB_SLAM.
Does anyone know how I could solve this error? I would appreciate it a lot. Regards.
Hello,
It would be interesting to combine this SLAM system with a feature detector and descriptor other than ORB, primarily to compare performance in special use cases such as environments with few corners. However, the ORB version implemented here is greatly extended with specific functionality, so it probably wouldn't be easy to swap out.
Hello, I have been looking into your code recently,
but I could not understand how the relative pose between the first two frames is computed.
Could you please give me some references? Thanks.
Hi Raul,
Thanks for sharing the code.
I want to save the camera pose for all frames, not only for the keyframes. Could you please tell me how to do this?
thanks,