There are a number of very old threads on CUDA, so I'm starting a new one rather than resurrecting an old one. Does anyone here have experience setting up and developing in CUDA on a Windows platform? I know there is a Visual Studio/CUDA setup that uses NVIDIA hardware, but I am hoping to find something for AMD/Radeon. I have found something called [ocelot](https://code.google.com/archive/p/gpuocelot/) that supposedly works with AMD/Radeon (which I have on my laptop), but there are issues: 1. It has the smell of abandonment (it hasn't been updated since 2013) …

Hello, I was wondering which GTX 760 would be the best in an SLI combination with the Gigabyte GTX 760 WindForce 3X. That's because the WindForce 3X edition isn't available anymore. Help would be appreciated. Thanks in advance. Jeannot Thewissen

I've been editing on a (Windows-based) desktop and now want and need to transition to notebook computers. I've been assured that notebook computers now show specs that can rival any desktop-based system (if you have the $$$), especially HP's ZBook series, which is built with video and content creation in mind. However, due to budget constraints I am looking at the Acer Aspire V17 Nitro Black Edition as a possible candidate. It sports a fast i7 processor with 16GB RAM and an Nvidia GTX 860M video card. My question is: do the more compact M-series of Nvidia …

I am proceeding with parallel processing using the GPU. While installing CUDA, I built the DLL files in CMake to get GPU support. I included all the CUDA files, and now I am trying to rebuild the following solution:

```cpp
#include <iostream>
#include <stdio.h>
#include "C:\Users\admin\Documents\opencv\build\include\opencv2\opencv.hpp"
#include "C:\Users\admin\Documents\opencv\build\include\opencv2\gpu\gpu.hpp"

using namespace std;

int main(int argc, char* argv[])
{
    try
    {
        cv::Mat src_host = cv::imread("image2.jpg", CV_LOAD_IMAGE_GRAYSCALE); // read in image
        cv::gpu::GpuMat dst, src;  // allocate space
        src.upload(src_host);      // upload
        cv::gpu::threshold(src, dst, 128.0, 255.0, CV_THRESH_BINARY); // threshold
        cv::Mat result_host;
        dst.download(result_host); // download result
        cv::imshow("Result", result_host); // display
        cv::waitKey(5000);
    }
    catch(const cv::Exception& ex)
    {
        std::cout << "Error: " << ex.what() …
```

I need to detect the GPU (video card) and choose settings for the app appropriate to the GPU's performance. I'm able to make a list with settings for each GPU model, but I don't understand how to easily detect the model of the GPU installed in the PC. What is the best way to solve this task? Is there any way to do this that does not depend on the installed driver?

Hi, I am trying to make a DLL file that performs its tasks using the CUDA library. Here is a simple version of my code, CUDADll.cu:

```cpp
#include <iostream>
#include "cuda.h"
#include "cuda_runtime.h"
#include "device_launch_parameters.h"

__global__ void test(int a, int b, int* c)
{
    *c = a + b;
}

extern "C" __declspec(dllexport) int sumTEst(int a, int b)
{
    int c = 0;
    test<<<1,1>>>(a, b, &c);
    return c;
}
```

When I compile this file, there is no problem. But when I build the project, which has only the file above, an error occurs: Error 1 error LNK1561: entry point …
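For what it's worth, a hedged sketch of how code like the above is usually fixed up. LNK1561 typically means the project is being built as an executable (which needs a `main` entry point) rather than as a DLL, so the project's Configuration Type would need to be set to "Dynamic Library (.dll)". Separately, the kernel above writes through a host pointer (`&c`), which a CUDA kernel cannot do; the result slot has to live in device memory. The sketch below keeps the original `sumTEst` export name and illustrates both points; it is not a tested drop-in.

```cuda
// Hedged sketch, not a tested drop-in: assumes the project's Configuration
// Type is set to "Dynamic Library (.dll)" so no entry point is required.
#include "cuda_runtime.h"

__global__ void test(int a, int b, int* c)
{
    *c = a + b; // runs on the device, so c must point at device memory
}

extern "C" __declspec(dllexport) int sumTEst(int a, int b)
{
    int  h_c = 0;        // result on the host
    int* d_c = nullptr;  // result slot on the device

    cudaMalloc(&d_c, sizeof(int));      // allocate device memory
    test<<<1, 1>>>(a, b, d_c);          // launch a single thread
    cudaMemcpy(&h_c, d_c, sizeof(int),
               cudaMemcpyDeviceToHost); // copy the result back
    cudaFree(d_c);
    return h_c;
}
```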

Hi all, some laptops come with dedicated GPUs while some come with an integrated GPU. I bought a Sony Vaio laptop with an Intel Core i5-2450 and an Nvidia graphics card. Will my laptop's graphics (like Windows Aero) still work if I disable the Nvidia driver? What do I have to do in order to make it work like that? Does it work if I install the Intel graphics drivers? Thanks in advance.

I recently got blue-screened about four times. Some time later, I noticed that the nVidia GPU (630M), which is used with Optimus along with the onboard graphics, isn't being detected by the computer. I just want to know if there is some way to make sure that the GPU is dead before doing anything else about it, maybe through the BIOS or some other method. I no longer have the crash dumps, however. A couple of specs: Microsoft Windows 7 Home Premium 64-bit SP1, Intel Core i5-3210 @ 2.50GHz, 6.00 GB dual-channel DDR3 @ 798MHz. And the GPU is an …

Hi, I am implementing an application which uses the GPU, but I am having a problem allocating GPU memory. First of all, my work environment is Windows 7, Visual Studio 2010, and an NVIDIA Quadro NVS 290 (256MB). When I run a piece of CUDA code: `cudaMemGetInfo(&freeMem, &totalMem);` I get 256MB of total memory and only 3-10MB of free memory. Because there is not much free space, my application mostly can't successfully allocate GPU memory; an outOfMemory error keeps being thrown. After crazy googling, I found out Windows 7 has already allocated about 95% of GPU memory for its theme manager and other things. But I …
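As a hedged aside on how such a check is usually wired in: query the free memory first and refuse requests that would not fit, rather than letting `cudaMalloc` fail. The request size and headroom below are arbitrary illustrative values; this sketch assumes a CUDA-capable device is present.

```cuda
// Hedged sketch: check free device memory before allocating.
#include <cstdio>
#include "cuda_runtime.h"

int main()
{
    size_t freeMem = 0, totalMem = 0;
    cudaMemGetInfo(&freeMem, &totalMem);  // bytes free / total on the device

    printf("free: %zu MB, total: %zu MB\n",
           freeMem / (1024 * 1024), totalMem / (1024 * 1024));

    size_t wanted = 16 * 1024 * 1024;     // illustrative 16MB request
    if (wanted + (1 << 20) > freeMem) {   // keep ~1MB headroom (arbitrary)
        fprintf(stderr, "not enough free GPU memory, skipping allocation\n");
        return 1;
    }

    void* buf = nullptr;
    cudaMalloc(&buf, wanted);
    // ... use buf ...
    cudaFree(buf);
    return 0;
}
```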

(My apologies if this is the wrong forum section.) Hello everyone, I have recently been confronted by two problems with my computer, which I purchased two years ago. These are my specs: quad-core Intel i7 860 @ 2.80GHz, ATi Radeon HD 5770 1GB, 4GB of DDR3 RAM, 500GB SATA hard drive, 600W PSU, running Windows 8 Consumer Preview. Whenever rendering videos (Adobe Premiere & Sony Vegas) or playing games (Battlefield 3), my computer suddenly shuts off. I thought it might be a heat problem, so I ran RealTemp while playing BF3 and saw that the temperature on all …

Okay, so a few weeks ago I started using my HDTV as one of the monitors on my GTX 285 graphics card. Then one morning, after putting the system to sleep, I woke up to turn it on. All of a sudden a fan (which I am almost 100% sure is the GPU's) is loud. I think it's spinning at 100%, and the thing is it won't die down. The other problem is I have no monitor image. I mean, I tried using just my 20" monitors, my TV, using both ports, only one screen, and I get nothing on them (I …

[ATTACH=RIGHT]18689[/ATTACH]DisplayLink is a combination hardware/software technology that enables users to connect almost any kind of monitor to a PC via USB. It is designed to make adding additional monitors very easy as well. You can add up to 6 displays without installing another video card. Simply install the DisplayLink software, plug the new display into your USB port or hub, and you are set. The technology is also designed to act as a docking station, adding more USB, audio and network ports. Currently DisplayLink technology can be found in a number of USB docking stations (like this one from HP), USB …

[ATTACH]14834[/ATTACH] These are the temps of my computer (the green line up there is the GPU) after about five minutes of Crysis at medium settings, no AA, resolution 1024x768. I have an nVidia GeForce GT 220 and I noticed that it has no fan at all. When I first bought the computer I didn't understand any of this, but I'm starting to get the hang of it now. My specs are: 4 CPUs at 2.3 GHz, 6144MB RAM and Windows 7 64-bit. At first I could play lots of games without any problems; nowadays I can play for a …

[ATTACH=RIGHT]16729[/ATTACH]Intel's latest line of processors, code-named "Sandy Bridge", was designed with intense multimedia demands in mind. Although we won't likely see a Sandy Bridge chip on the market until early 2011, we have learned a good deal of information regarding the architecture of the processor. The greatest leap in design is the debut of silicon that is devoted to transcoding data. Think data conversion. An everyday example would be the transcoding of audio and video data when playing a media file that requires a special codec (QuickTime, DivX, etc.) for playback, or converting something between two formats, …

Would appreciate a fast reply on this one, guys. I have a PC with a DG35EC mobo, C2Q Q6600 2.4GHz processor, 4GB DDR2 800MHz RAM, and a 500GB Seagate HDD, running 64-bit Windows 7 Ultimate. The PSU is a Prime Source Tech AX-400S with +3.3V - 30.0A, +5V - 28.0A, +12V1 - 14.0A, +12V2 - 16.0A, -12V - 0.5A, +5VSB - 2.5A. I have now bought an XFX HD 4670 1GB DDR3 card (I had an 8500GT before, which worked great). My PC refuses to recognize the card and boots straight to Win7 using onboard graphics. After being through many sites …

I disassembled an Asus A7S-A1 two nights ago for a BGA reflow. Following the reflow I inserted a stick of RAM, re-seated the heatsinks with the fan, attached the power button to the motherboard, and then connected the battery to test if the reflow attempt was successful... good so far, but here's where it all went wrong. Lastly I connected the display data/video cable to the board, and there was a faint pop, then a burning smell, and now no power AT ALL, no POST, nothing. Obviously I should have had everything connected AND THEN connected the battery last. Anyway, that's where I am now. It wasn't …

Hi, please could you help me with this problem. I have purchased a laptop for my son from Ebay (wish I'd bought new; my advice: don't collect in person an item that offered shipping, as it apparently voids your PayPal protection). Originally we were experiencing a problem with the battery not taking a charge and dying within moments of the power supply being removed. Then, within a few hours, any reboots were causing the laptop to stall on boot with a series of beeps that meant media failure. We overcame this by rebooting, reseating the battery and power wire, and all was fine again …

Every August, the Institute of Electrical and Electronics Engineers (IEEE) holds a two-day symposium known as Hot Chips on the campus of Stanford University. If you've ever walked around the Stanford campus in August, you'd know how appropriate that term is. It's a gathering of some very smart people with very heavy accents, talking several levels above most people's understanding of microprocessor technology. After all, many of the presenters hold doctorates in electrical engineering and computer science. The show tends to be a who's who in the chip field, and this year's show was no exception: Intel (NASDAQ: INTC), AMD (NYSE: …

[B]ATI Takes the Lead[/B] [ATTACH=RIGHT]16273[/ATTACH]After a long stretch of dominance, NVIDIA has fallen behind ATI in discrete GPU shipments. According to Mercury Research, ATI claims 51% of the market, and NVIDIA has slipped to, you guessed it, 49%. NVIDIA was the leader this time last year with 59% market share, illustrating an about-face in overall shipments. ATI also took the second-place position from NVIDIA in the integrated GPU (IGP) market, pulling in 24.5% compared to NVIDIA's 19.8%. Intel has traditionally led the pack, and clocked in last quarter at a healthy 54% of IGP shipments. These are no doubt tough …

I have heard you can efficiently use your GPU to do fast floating-point calculations. This can be done using OpenGL, though you may have to use the library in an unorthodox way, or with graphics-card toolkits such as CUDA (by nVidia), which I guess is graphics-card specific. Does anyone have experience with this and an opinion on the best tools available now? Thanks!
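To give a flavor of what the CUDA route looks like, here is a minimal sketch of a saxpy (y = a*x + y) kernel, the usual hello-world of GPU floating-point work. The vector size and launch configuration are illustrative choices; the sketch assumes the CUDA toolkit and an NVIDIA card are available.

```cuda
// Hedged sketch of GPU floating-point work with CUDA: y = a*x + y.
#include <cstdio>
#include "cuda_runtime.h"

// Each thread handles one element of the vectors.
__global__ void saxpy(int n, float a, const float* x, float* y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = a * x[i] + y[i];
}

int main()
{
    const int n = 1 << 20;            // 1M elements (illustrative size)
    size_t bytes = n * sizeof(float);

    float *x, *y;
    cudaMallocManaged(&x, bytes);     // unified memory keeps the sketch short
    cudaMallocManaged(&y, bytes);
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    int threads = 256;
    int blocks  = (n + threads - 1) / threads;
    saxpy<<<blocks, threads>>>(n, 2.0f, x, y);
    cudaDeviceSynchronize();          // wait for the kernel to finish

    printf("y[0] = %f\n", y[0]);      // each element should now be 4.0
    cudaFree(x);
    cudaFree(y);
    return 0;
}
```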

[ATTACH=right]15788[/ATTACH]High-end video accelerator tools now make password recovery twentyfold faster than the current top-of-the-line quad-core CPUs. Unlike current recovery tools, such as those offered by Intel, [URL="http://www.nvidia.com/object/fermi_architecture.html"]these new Fermi-based video boards[/URL] exceed the benchmark speeds of even previous NVIDIA Tesla solutions based on the previous-generation chipsets. CPUs will still prevail, but GPUs found in modern video cards such as the NVIDIA GTX 480 are becoming the new standard. Depending on the application, CUDA-based GPUs can contribute more to the total performance of a PC than the central processing unit. In some applications, their role is greater than the role of …

Nvidia yesterday began shipping the [url=http://www.nvidia.com/object/product_geforce_gtx_480_us.html]GeForce GTX 480[/url], with 512 processor cores, double the number previously available. The card is intended to permit smooth playback and editing of 2D or 3D content, including animation, games and videos. The GTX 480 supports Microsoft DirectX 11, Direct3D and [url=http://en.wikipedia.org/wiki/Physx]PhysX[/url], Nvidia's real-time physics engine and SDK that came along with its acquisition of Ageia in 2008. Now tailored for Nvidia's CUDA GPUs, PhysX off-loads physics processing tasks from a system's main CPU. The board uses Nvidia's latest graphics processing architecture--called [url=http://en.wikipedia.org/wiki/GeForce_400_Series]Fermi[/url] after nuclear reactor inventor [url=http://en.wikipedia.org/wiki/Enrico_Fermi]Enrico Fermi[/url]--which the company says …

Hi guys, I am starting this thread so I can get some potentially helpful information regarding a new computer I want to buy. I want to get a custom-built PC, but I don't know how to build one myself. I have a budget of around $3000 (AUD), and I plan on using my PC for a variety of things like gaming and programming. I already have an idea of the GPU and CPU I want: the i7 920 and the ATI Radeon 5790, but unfortunately I don't think this GPU will fit into my budget. …

A week ago I was reporting how 'leaked' documents were suggesting that the PlayStation 4 would [URL="http://www.daniweb.com/blogs/entry3890.html"]build upon existing CELL processor architecture[/URL] with the Cell Broadband Engine (Cell BE). At the time I said that this was good news, not bad, in that it meant costs should be kept down and that will be important if Sony is to avoid the pricing mistakes it has made with the PS3 which have effectively hamstrung any chance of gaming market dominance despite the undoubted superiority of the hardware compared to other consoles. Now, with one eye still firmly on the [URL="http://www.itwire.com/content/view/22576/532/"]likely release …

The End.