
How can I use the GPU with Java programming?

I have been using CUDA C to access the GPU, but now my guide has asked me to work with Java and the GPU. So I searched the Internet and found that Rootbeer is the best option for this, but I am not able to understand how to run a program using Rootbeer. Can someone tell me the steps for using Rootbeer?
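For reference, here is a minimal sketch of the Rootbeer pattern, based on the example in the project's README: you implement the Kernel interface, put the GPU work in gpuMethod(), and hand a list of kernel instances to a Rootbeer runtime object. The org.trifort.rootbeer.runtime package and the run(List<Kernel>) call match recent releases; older builds used different package names, so treat the exact names as assumptions rather than a definitive recipe.

```java
import java.util.ArrayList;
import java.util.List;

import org.trifort.rootbeer.runtime.Kernel;
import org.trifort.rootbeer.runtime.Rootbeer;

// One kernel instance per array element; Rootbeer cross-compiles
// gpuMethod() (and everything reachable from it) to CUDA.
public class SquareKernel implements Kernel {
    private final int[] data;
    private final int index;

    public SquareKernel(int[] data, int index) {
        this.data = data;
        this.index = index;
    }

    @Override
    public void gpuMethod() {
        data[index] = data[index] * data[index];
    }

    public static void main(String[] args) {
        int[] data = new int[1024];
        for (int i = 0; i < data.length; i++) {
            data[i] = i;
        }

        List<Kernel> kernels = new ArrayList<Kernel>();
        for (int i = 0; i < data.length; i++) {
            kernels.add(new SquareKernel(data, i));
        }

        // Serializes the kernels to the GPU, runs them, and copies
        // the results back into the Java heap.
        Rootbeer rootbeer = new Rootbeer();
        rootbeer.run(kernels);

        System.out.println(data[10]); // expect 100
    }
}
```

The build step is the part that is easy to miss: you compile this into a normal jar first, then post-process that jar with the Rootbeer compiler (roughly java -jar Rootbeer.jar app.jar app-gpu.jar, where the jar names here are placeholders), and run the resulting output jar on a machine with a CUDA-capable GPU.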

Mark Harris from Nvidia gave a nice talk about the future of CUDA at SC14. You can watch it here.

The main thing that may be of interest to you is the part where he talks about programming languages, and especially Java. IBM is working on CUDA4J, and there are some nice plans to use Java 8 features, especially lambdas, for GPU programming. However, I am not a Java user and I can't answer your question regarding Rootbeer (besides the taste), but maybe CUDA4J will be something that suits you, especially if you know how to write CUDA C and need a solution backed by a company like IBM.
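To make the lambda remark concrete: the direction discussed in the talk is offloading data-parallel Java 8 streams to the GPU. The snippet below is plain Java 8 with exactly that shape, a SAXPY loop written as IntStream.range(...).parallel().forEach(...) over a lambda. On a stock JVM it runs on CPU threads; the GPU offload itself is a capability of IBM's Java SDK, not of the standard library, so that part is an assumption here.

```java
import java.util.stream.IntStream;

public class SaxpyStream {
    public static void main(String[] args) {
        int n = 1 << 20;
        float a = 2.0f;
        float[] x = new float[n];
        float[] y = new float[n];
        for (int i = 0; i < n; i++) {
            x[i] = i;
            y[i] = 1.0f;
        }

        // SAXPY as a data-parallel lambda: y[i] = a * x[i] + y[i].
        // This is the loop shape that a GPU-aware JIT can offload.
        IntStream.range(0, n).parallel().forEach(i -> y[i] = a * x[i] + y[i]);

        System.out.println(y[3]); // expect 2 * 3 + 1 = 7
    }
}
```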
