tag:blogger.com,1999:blog-3779956188045272690.post5301437729431121870..comments2024-03-22T01:46:59.425-04:00Comments on Procedural World: Displacement and BeyondMiguel Ceperohttp://www.blogger.com/profile/17586513342346629237noreply@blogger.comBlogger14125tag:blogger.com,1999:blog-3779956188045272690.post-67809557670921525582016-04-18T18:21:57.267-04:002016-04-18T18:21:57.267-04:00i need bi-weekly news!!Anonymoushttps://www.blogger.com/profile/02841122186021304017noreply@blogger.comtag:blogger.com,1999:blog-3779956188045272690.post-34805672813089716502016-04-13T07:41:21.902-04:002016-04-13T07:41:21.902-04:00My actual remark is that you cannot have interesting planets generated in realtime at spaceship-approach speeds on consumer-grade hardware. It does not really matter whether you do it top-down, although, as you say, top-down is likely to produce better results.<br /><br />It is really about informational entropy. It does not matter where it comes from; it could be man-made, automatically generated, or even sampled from the real world. It takes *work* to produce entropy, there is no escape. Realtime generation aims to skip this work by using low-entropy methods like local mathematical functions. Crappy is in the eye of the beholder; I will just say that line of research is not very interesting to me.<br /><br />With enough time and energy, it is possible to automate the creation of very rich environments. This is where I want to spend my time.Miguel Ceperohttps://www.blogger.com/profile/17586513342346629237noreply@blogger.comtag:blogger.com,1999:blog-3779956188045272690.post-9552550589700232032016-04-12T23:34:52.204-04:002016-04-12T23:34:52.204-04:00I've been watching how Space Engine generates planetary features on a grand scale.
It's more natural looking at the planetary scale than at the local scale, but I want to think it might be one way to approach part of the design problem here: driving features top-down by planet type/conditions. Granted, I don't have first-hand experience with the problem you often mention of "total automation having a high garbage:realistic ratio." Anonymousnoreply@blogger.comtag:blogger.com,1999:blog-3779956188045272690.post-24652140232694738022016-03-24T08:18:28.489-04:002016-03-24T08:18:28.489-04:00The key reason why not is performance. Each metamaterial needs to be transformed on the fly, in realtime. It has to be applied along the surface direction, which involves rotation. It also has to be stretched and scaled in order to provide more diversity.
These transforms are expensive to compute with voxels at the moment.<br /><br />But yes, volumetric content for the metamaterials is eventually the right answer.Miguel Ceperohttps://www.blogger.com/profile/17586513342346629237noreply@blogger.comtag:blogger.com,1999:blog-3779956188045272690.post-84231389385043585692016-03-23T17:38:00.541-04:002016-03-23T17:38:00.541-04:00OTOH, using a deep learning network you can feed in erosion pictures and get out erosion effects (or even, for instance, extremely complex things like animated weather or sun activity) that would potentially be non-computable in any sane amount of time. Even though it is not physically accurate (the algorithm is just mimicking pictures), the end result can potentially be more accurate than you could get writing a physically accurate algorithm by hand. Anonymousnoreply@blogger.comtag:blogger.com,1999:blog-3779956188045272690.post-18139747943747817782016-03-23T17:34:30.178-04:002016-03-23T17:34:30.178-04:00So the obvious question: why not move to displacement voxels instead of displacement maps? Tie this in with detail levels and whammo! Displacement voxels could probably make for interesting procedural meta-materials, so you could have volumetric bodies and such. Pretty amazing developments though, very inspiring!Anonymousnoreply@blogger.comtag:blogger.com,1999:blog-3779956188045272690.post-50711336818340199622016-03-23T10:02:49.559-04:002016-03-23T10:02:49.559-04:00I see AI and procgen both as branching/expanding subjects, although AI is growing at a much faster rate.
Eventually, when the connection is made between the two and AI gets the ability to use procgen as an output, we will see a huge boost to procgen.Ajmhttps://www.blogger.com/profile/12985928729302303917noreply@blogger.comtag:blogger.com,1999:blog-3779956188045272690.post-56818132668740286352016-03-23T09:21:02.699-04:002016-03-23T09:21:02.699-04:00I believe that in the really long term, procedural generation is not a thing. There is only AI. This is our mantra: automate the artist, not the subject. Like you say, a human painter will recall how erosion looks instead of running erosion filters in his/her mind.<br /><br />Today both AI and ProcGen are a collection of very domain-specific tools. There is little common ground among AI techniques. I'd say there is less common ground between AI and ProcGen.Miguel Ceperohttps://www.blogger.com/profile/17586513342346629237noreply@blogger.comtag:blogger.com,1999:blog-3779956188045272690.post-45233673703508256732016-03-23T00:29:57.277-04:002016-03-23T00:29:57.277-04:00I was thinking, there is another technology I know of that is structured very similarly to this: AI. Some of the most powerful AI applications developed to date use layers to help interpret pictures/sounds/videos. I find the similarities between what you are doing and how AI operates astounding. You are essentially working AI backwards to create worlds/things/etc.<br /><br />Perhaps (and I am just thinking out loud here) it would be possible to incorporate AI into procedural generation.
That way, instead of feeding it code to generate models/terrain/etc., you could program the AI to generate code based on examples of code you feed it, plus examples of real pictures or hand-crafted models to give the AI goals.Ajmhttps://www.blogger.com/profile/12985928729302303917noreply@blogger.comtag:blogger.com,1999:blog-3779956188045272690.post-4864484977827613822016-03-22T11:54:39.380-04:002016-03-22T11:54:39.380-04:00In this implementation distance does not matter at all, so yes, biomes would be generated. Generation happens at the target resolution. If the planet is too far away, you may get only a very coarse depiction of the biomes. Distance becomes a factor when you look at the specifics of rendering. At some distances, where the geometry may not be important, you could feed the metadata to the shaders and bump up the perceived amount of detail while skipping the entire voxel part of the pipeline.Miguel Ceperohttps://www.blogger.com/profile/17586513342346629237noreply@blogger.comtag:blogger.com,1999:blog-3779956188045272690.post-21282267327029286082016-03-22T09:16:56.867-04:002016-03-22T09:16:56.867-04:00So if you are far off from a planet, will it still trigger the biomes to be generated? Or is it more of a ratio between size and distance?Ajmhttps://www.blogger.com/profile/12985928729302303917noreply@blogger.comtag:blogger.com,1999:blog-3779956188045272690.post-64798984980542113322016-03-22T08:29:47.850-04:002016-03-22T08:29:47.850-04:00Yes, you clearly got it. At the moment the system has only two levels. This is mainly because I believe generation rules may be different at different scales. For instance, you may place biomes using a different logic than the one used to place spots on a leopard.
The system only uses distribution maps for now to break down the next level of materials.<br /><br />There are no distance triggers; it is more a matter of scale. Depending on the size you assign to it, the meta-material-to-material conversion will happen at different levels.Miguel Ceperohttps://www.blogger.com/profile/17586513342346629237noreply@blogger.comtag:blogger.com,1999:blog-3779956188045272690.post-66190331870208719712016-03-22T07:57:04.655-04:002016-03-22T07:57:04.655-04:00I get where you are going with this. It is nested materials all the way down. A planet meta-material has just enough information to generate a unique set of biome meta-materials, and each biome has just enough information to generate its topological/feature data all the way down to individual object instances, which will be textured with more meta-materials that define things like 3D/2D bark/stone/leaf/etc. textures.<br /><br />That said, I do have one question. Do meta-materials have individual minimum distance triggers?Ajmhttps://www.blogger.com/profile/12985928729302303917noreply@blogger.com
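The scale-driven, nested expansion discussed in this thread ("no distance triggers, it is more a matter of scale") could be sketched roughly as follows. This is a hypothetical illustration only: the names (`MetaMaterial`, `feature_scale`, `resolve`) and the hash-based stand-in for a distribution map are invented for the example and are not the actual Voxel Farm implementation.

```python
from dataclasses import dataclass, field

@dataclass
class MetaMaterial:
    """Hypothetical nested meta-material: each level holds just enough
    information to break down into the next level of materials."""
    name: str
    feature_scale: float  # world-space size of this material's features
    children: dict = field(default_factory=dict)  # id -> child MetaMaterial

    def pick_child(self, x: float, y: float) -> "MetaMaterial":
        # Stand-in for sampling a 2D distribution map: deterministically
        # hash the query position into one of the child materials.
        ids = sorted(self.children)
        return self.children[ids[hash((round(x, 3), round(y, 3))) % len(ids)]]

def resolve(material: MetaMaterial, x: float, y: float,
            voxel_size: float) -> MetaMaterial:
    # Descend the hierarchy until this material's features are too small
    # to matter at the current voxel resolution. Conversion depends on the
    # scale assigned to each material, not on viewer distance.
    while material.children and material.feature_scale > voxel_size:
        material = material.pick_child(x, y)
    return material

# Planet -> biome -> rock: nested materials all the way down.
rock = MetaMaterial("sandstone", feature_scale=0.1)
biome = MetaMaterial("desert", feature_scale=10.0, children={0: rock})
planet = MetaMaterial("planet", feature_scale=1000.0, children={0: biome})
```

With this sketch, a coarse voxel grid (say, `voxel_size=100.0`) resolves only down to the biome level, while a fine grid (`voxel_size=0.01`) reaches the leaf materials, mirroring the idea that a distant planet yields only a coarse depiction of its biomes.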