Comments (17)
@raininglixinyu Hi, I think you are right. Using DispatchTime here is inappropriate, since this is not necessarily a Dispatch job (Dispatch is for executing code concurrently on multicore hardware by submitting work to dispatch queues managed by the system).
A friend of mine ran a few tests, and I hope they help:
Testing CACurrentMediaTime and DispatchTime on a playground or the simulator, the results are the same; testing with a fresh project (or this project) on a device, CACurrentMediaTime = 41.6 * DispatchTime.
It's quite interesting and we are still trying to figure out why there's a difference. But either way, CACurrentMediaTime is clearly the appropriate method for measuring time here.
Also, running on an iPhone 7 Plus, live mode actually takes around 120 ms/frame instead of 3 ms/frame.
Sorry about the potential confusion I caused; I have corrected all branches.
It looks like your research/project is quite sensitive to timing, so here are some articles I found that may help you understand the differences:
- http://stackoverflow.com/questions/41173525/inaccurate-dispatchtime-now-in-swift-3-0
- http://stackoverflow.com/questions/5748700/iphone-a-question-about-dispatch-timers-and-timing
- http://nshipster.com/benchmarking/
from caffe2-ios.
@KleinYuan Thank you for paying close attention to my question, I really appreciate it. I think this issue can be closed now!
@raininglixinyu Hey, thanks for the kind words. I haven't tested it that specifically because I don't have an iPhone 6, but I can add a time-consumption readout to the real-time detector, which we already have in a development branch (feature-models), so that you can evaluate it better. Would that help?
@raininglixinyu I just added a time-consumption readout for the classifier in this branch; please check it out.
For further details, you can check the code.
This should be enough for you to test whatever images you like on whatever iOS devices.
@KleinYuan Hi, actually I have tested it on my iPhone 6, but the performance is quite different from https://github.com/RobertBiehl/caffe2-ios, which takes only 6 ms on an iPhone 6 (according to the author's description). So I am confused.
@raininglixinyu Resizing the image also takes time (and sometimes takes most of the time), so it's hard to say that only 6 ms will be spent for all images. If you test with the same image size, the results should be similar.
@KleinYuan I think I benchmarked just the predict step. The code is:
double start = clock();
_predictor->run(input_vec, &output_vec);
double end = clock();
printf("The time was: %f\n\n", (double)(end - start) / CLOCKS_PER_SEC);
@raininglixinyu Ah, I see what you mean now. It seems you got a much bigger number than what we measure in the Swift code, right?
@KleinYuan Yeah, you are right. Is there something important I missed?
@raininglixinyu I wonder whether the function you use loses some precision. If you check the Swift code, I directly use DispatchTime, which provides nanosecond precision.
I think you need a nanosecond-precision function to get the correct value.
This may help: http://stackoverflow.com/questions/10192903/time-in-milliseconds
@KleinYuan Hi, DispatchTime may not be the appropriate way to get the correct value. I have tested DispatchTime against CACurrentMediaTime in the same project, and the results show a great deal of difference between the two.
@raininglixinyu This article seems to have a legitimate experiment on those two methods, and the difference doesn't look that large. But sure, what counts as "large" depends on the required precision. Also, why do you think using DispatchTime is inappropriate? The official documentation doesn't seem to say that: https://developer.apple.com/reference/dispatch/dispatchtime
Could you elaborate on your tests a bit? Just curious.
@raininglixinyu Here's a good explanation of why we got CACurrentMediaTime = 41.6 * DispatchTime on device. In short: s_timebase_info.numer / s_timebase_info.denom gives the conversion rate between Mach absolute time ticks and nanoseconds, which on this device is 125/3 = 41.6666667.
#include <mach/mach_time.h>

int getUptimeInMilliseconds()
{
    const int64_t kOneMillion = 1000 * 1000;
    static mach_timebase_info_data_t s_timebase_info;

    if (s_timebase_info.denom == 0) {
        (void) mach_timebase_info(&s_timebase_info);
    }

    // mach_absolute_time() returns Mach ticks; multiplying by numer/denom
    // converts ticks to nanoseconds, and dividing by one million gives milliseconds
    return (int)((mach_absolute_time() * s_timebase_info.numer) / (kOneMillion * s_timebase_info.denom));
}
Also check this experiment from Apple.
Credit goes to @VinsonLi
@KleinYuan If so, let's update the "real time classifier" screenshot (which says it only costs 3 ms) to avoid confusion.
@austingg Sure thing, will do. I was planning to buy a hot dog and take a new screenshot.
@KleinYuan Haha, great work! Very useful demo!
I will try NNPACK on iOS.
@austingg Feel free to submit a PR :)