Description
| Required Info | |
|---|---|
| Camera Model | D435 |
| SDK Version | librealsense 2.8.0 |
| Operating System & Version | Windows 10 |
| CPU | Intel Core i7 860 (2.8GHz) |
First off: the RealSense Viewer that came with the SDK doesn't have this problem, for some reason.
But building the samples on my PC, as well as my own code, results in CPU usage above 90%.
Running the compiled capture example produces CPU spikes of 100% while frames are actually being displayed, followed by drops to 0% during which no new frames are shown.
I think that part is down to the slow USB 3.0 ports (a PCIe extension card) of the workstation I'm on; the RealSense Viewer also displays no frame data if I activate multiple streams at medium resolution or a single stream at high resolution.
If I edit the rs-capture.cpp example to activate only a 640x480 color stream, the USB connection has no trouble keeping up anymore, but the CPU utilization is then constantly above 90%:
```cpp
// Pipeline configured for a single 640x480 RGB color stream at 30 FPS
rs2::context* ctx = new rs2::context();
rs2::config* conf = new rs2::config();
conf->disable_all_streams();
conf->enable_stream(RS2_STREAM_COLOR, 0, 640, 480, RS2_FORMAT_RGB8, 30);
rs2::pipeline* pipe = new rs2::pipeline(*ctx);
pipe->start(*conf);
```
We also tried running the D435 on other hardware with a better USB connection. That fixes the problem of only being able to run low-resolution streams, but the CPU utilization is about equally high on those machines as well.
Profiling the example shows that around 70% of the time is spent evaluating `while(app)`, or more precisely in `glfwSwapBuffers(win);`, which doesn't really make sense to me (a sketch of what I think `while(app)` actually evaluates is included below, after the code). So I removed all the visualisation code and only gather frames in a while loop. Profiling that version still shows CPU usage above 90%, with over 95% of that time spent inside realsense2.dll:
```cpp
#include <librealsense2/rs.hpp>
#include <iostream>

int main(int argc, char * argv[]) try
{
    rs2::colorizer color_map;
    rs2::context* ctx = new rs2::context();
    rs2::config* conf = new rs2::config();
    conf->disable_all_streams();
    conf->enable_stream(RS2_STREAM_COLOR, 0, 640, 480, RS2_FORMAT_RGB8, 30);
    rs2::pipeline* pipe = new rs2::pipeline(*ctx);
    pipe->start(*conf);

    while (true)
    {
        rs2::frameset data = pipe->wait_for_frames();
        // Depth is disabled above, so only colorize when a depth frame is actually present
        if (rs2::depth_frame depth = data.get_depth_frame())
        {
            rs2::frame colorized = color_map(depth);
        }
        rs2::frame color = data.get_color_frame();
    }
    return EXIT_SUCCESS;
}
catch (const rs2::error & e)
{
    std::cerr << "RealSense error calling " << e.get_failed_function() << "(" << e.get_failed_args() << "): " << e.what() << std::endl;
    return EXIT_FAILURE;
}
catch (const std::exception & e)
{
    std::cerr << e.what() << std::endl;
    return EXIT_FAILURE;
}
```
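For context on the `while(app)` observation: as far as I can tell, the condition evaluates the example window helper's `operator bool()` from example.hpp, which swaps buffers and polls events, so that is presumably where the glfwSwapBuffers time shows up. The following is only a rough sketch from memory (names and details may differ from the actual sources shipped with 2.8.0):

```cpp
#include <GLFW/glfw3.h>

// Hypothetical stand-in for the examples' window helper; only the part that
// while(app) evaluates is sketched here.
struct window_sketch
{
    GLFWwindow* win;

    operator bool()
    {
        glfwSwapBuffers(win);                           // present the previous frame (may block on vsync)
        bool keep_running = !glfwWindowShouldClose(win);
        glfwPollEvents();                               // process pending window/input events
        glClear(GL_COLOR_BUFFER_BIT);                   // clear the framebuffer for the next frame
        return keep_running;
    }
};
```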
Any idea why that is, and why the RealSense Viewer manages to display the same streams with only about 5% CPU?
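In case it helps with reproducing: a minimal way to check whether the time really goes into `wait_for_frames` would be to time each call (a sketch only, using nothing beyond std::chrono and the same pipeline setup as above). At 30 FPS I would expect each call to block for roughly 33 ms:

```cpp
#include <librealsense2/rs.hpp>
#include <chrono>
#include <iostream>

int main() try
{
    rs2::config conf;
    conf.disable_all_streams();
    conf.enable_stream(RS2_STREAM_COLOR, 0, 640, 480, RS2_FORMAT_RGB8, 30);

    rs2::pipeline pipe;
    pipe.start(conf);

    for (int i = 0; i < 300; ++i)   // roughly 10 seconds at 30 FPS
    {
        auto t0 = std::chrono::steady_clock::now();
        rs2::frameset data = pipe.wait_for_frames();
        auto t1 = std::chrono::steady_clock::now();

        auto ms = std::chrono::duration_cast<std::chrono::milliseconds>(t1 - t0).count();
        // If each call blocks close to 33 ms and the CPU is still pegged, the load is
        // coming from the library's own threads rather than from this loop.
        std::cout << "wait_for_frames took " << ms << " ms" << std::endl;
    }
    return EXIT_SUCCESS;
}
catch (const rs2::error & e)
{
    std::cerr << e.what() << std::endl;
    return EXIT_FAILURE;
}
```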