Spaceship Generator (github.com/a1studmuffin)
850 points by mnem on June 18, 2016 | 128 comments



Sadly it doesn't seem to generate UVs for the models. But using the cube projection in Blender seems to do an OK job.

Add a skybox and a couple of simple particle effects in PlayCanvas.

And we have some WebGL spaceships :-)

https://playcanv.as/p/kZtPZpnH/


Awesome!

I took a screenshot to help out the computationally challenged:

http://www.weegeeks.com/upload/a1studmuffin-daredevildave-sp...


Genuine question - why are UVs always a huge issue? Isn't it trivial to generate a UV from the actual 3D model? Or do you mean that they just don't give you an image that actually fits the model?


It's trivial to generate "some" UVs. However, in this case there is a texture with a lighting pattern as part of the generator. Ideally the model would be unwrapped so that the correct areas of the texture correspond to the correct areas of the model.

In the case of an artist working from scratch, the skill is in generating UVs that let the artist easily paint the texture and that give an even texel area to each part of the model. Otherwise some areas will be stretched and blurry.


Pardon my ignorance, but what are UVs?


https://en.wikipedia.org/wiki/UV_mapping

Though daredevildave may have meant environment mapping instead.


Nope, that's right. There are no UV co-ordinates generated in the Blender script. (Well, I think it uses world-space cube mapping for UVs, which is an option that PlayCanvas doesn't support.)

I used the automatic Cube Projection unwrap in Blender before exporting to FBX and importing into PlayCanvas.
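
For anyone wanting to script that step, it's roughly this from Blender's Python console (a rough sketch with 2.7x operator names, not necessarily the exact steps; the output path is just an example):

    import bpy

    # assumes the generated spaceship is the active, selected object
    bpy.ops.object.mode_set(mode='EDIT')
    bpy.ops.mesh.select_all(action='SELECT')
    bpy.ops.uv.cube_project()    # quick-and-dirty UVs, no seam work
    bpy.ops.object.mode_set(mode='OBJECT')

    # export just the selection to FBX for PlayCanvas
    bpy.ops.export_scene.fbx(filepath='//spaceship.fbx', use_selection=True)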


I agree with that!


Thanks!


Texture mapping (U ~= X, V ~= Y).


nice mapping


yes it is


Holy crap that site basically stops my computer.


That feeling when your cell phone is better than someone's desktop.


Firefox yes. Safari no.

Chrome though would start opening everything on Google apps.


Hmm. I only buy X220s (almost 6 years old now) because they're sturdy, have long battery life (15+ hrs under Linux) with a replaceable battery, etc., and they're very cheap (I picked up a few at $70 per system). I was prepared for crash and burn, but it actually runs this demo under Linux Chrome without issues. What GPU do you have?


Oops, also just realized there were some 4096x4096 textures in there. I've resized them down.


Runs flawlessly on my 2015 MBP on its native display.

Drag the window over to my Cinema Display, and holy 4fps and crashes, Batman!


Interesting, could be the particle effects. They fall back to CPU if your machine doesn't support float textures.


Very cool -- what about sliders to feed parameters for new ship generation :)


Unfortunately, the ship generation is done offline in Blender. I'd have to re-write his ship generator in JavaScript.


That's pretty awesome.


UVs haven't been needed for a long time now. They have many drawbacks (the first one being that they are explicit).

The renderers I used in production, when I left the VFX industry ca. 2010, all supported PTEX [http://ptex.us/ptexpaper.html].

Blender seems to have WIP support for it too nowadays [https://wiki.blender.org/index.php/User:Nicholasbishop/Ptex]


They are still very much required for games/real-time rendering :-)

I hadn't heard of ptex before; it's interesting stuff. Sounds like the reason it isn't in general use for realtime stuff is the memory overhead: http://sebastiansylvan.com/post/casting-a-critical-eye-on-gp...


I love that the ships look more like "skyscrapers in space" then airplane like spaceships we usually see in tv shows/movies.

This is most likely (according to multiple hard sci-fi authors) a much more realistic depiction of what spaceships are going to look like.

Example given: The Expanse - "Flip and Burn" https://www.youtube.com/watch?v=X4EiW1bHwsQ


I love that the ships look more like "skyscrapers in space" then airplane like spaceships

(than!)

To be really realistic, the planes of the floors should be oriented perpendicular to the thrust axis, or there should be a centrifuge incorporated. If ships are going to float around like ocean-going ships floating on top of an invisible ocean, with all of the ship floors somehow aligned, then this is the "space is an ocean" trope.

http://tvtropes.org/pmwiki/pmwiki.php/Main/SpaceIsAnOcean


Yes, I meant the floors are like the floors in skyscrapers; the thrust/acceleration creates the "artificial gravity".


See Mistake 6 at http://www.dedoimedo.com/physics/sci-fi-mistakes.html

Mistake 7 also hints about the possible shape when there are no preferred orientations.

Also interesting:

http://www.popularmechanics.com/space/deep-space/a8140/what-...

http://www.theregister.co.uk/2011/11/24/spaceship_design/


Since you linked these pages, you and others also might find this interesting:

http://www.projectrho.com/public_html/rocket/misconceptions....

In my opinion Atomic Rocket contains the most accurate discussion of hard science fiction I've ever encountered.


I just spent the last 6 hours reading a fraction of the pages on that site. Thanks for the link!


I don't get his complaint against windows, given that current spaceships have windows. Admittedly, many ships overdo them in sci-fi, but you wouldn't even need to go back a hundred years to find people who would have said that there was no way to produce glass sheets strong and large enough for their current use in skyscrapers. Windows are only a problem for near-future hard sci-fi, and then only when they're particularly large or prevalent.


Some of the designs remind me of the Earthforce ships from Babylon 5, a few of the carriers in the Wing Commander games, or some of them in Sins of a Solar Empire.


Looking at the very first picture, the first thing I thought of was the Agamemnon from Babylon 5. Some of the extreme examples remind me of Shadow ships.


Excellent show, and an even better series of hard sci-fi books to back it. I've never read a sci-fi book/series that depicts realistic high-g travel before The Expanse.


To me, "skyscrapers in space" feels like an attempt to capture the majesty of World War I and II battleships. Which is to say, not much better justified than the aerodynamic spaceship. I don't see why a hard-SF spaceship really needs to be big when it would still be quite fragile (given the energy of an orbital-velocity collision).

That is, unless you design your future weapons and FTL drive principles so they really do reward "bigger is better" (I'm on to you, David Weber.)


Same here, love the look.

Some of those generated ships gave me serious 'Event Horizon' vibes: https://www.youtube.com/watch?v=eiE-NB-kxh8



It'd be more realistic as well! Since space is a vacuum, we don't need to apply aerodynamics to the ships and they can be whatever shape we like!


True, but then the Endurance from Seveneves had to evolve a rather 'aero'dynamic (oblong) shape to protect itself from space debris with the huge piece of metal in its nose. If I interpret the text correctly, of course.


You still have the problem that heat from the sun accumulates differently in each part of your ship, causing stress on the hull.


For some reason I completely missed that show, it looks great. Thanks!


I saw the headline and initially assumed it would be some droll Game of Life invention. Then I thought, no, maybe it's some automatic sci-fi art generation script. And I was rewarded!

Procedural generation like this is quite possibly key to the future of indie games: if you don't have the team to design large sets of art assets, it's important to be able to put something pretty out there using your own wit. (A good example would be No Man's Sky.)


For me, the major impact of procgen is not the effort or cost, but the translation of skillset. I can handle gigantic, difficult algorithm design problems, and still maintain engagement and happiness. If I have to do even rudimentary art or level-editing tasks for any length of time, I quickly become miserable and quit.

Although, I do also happen to believe that ultimately, procgen has much more absolute potential than hand-made content. Minecraft is only 6 years old, and imo people have barely begun to expand on the basic concept.


You are severely underestimating the effort good procedural content takes.


And perhaps you are severely underestimating the effort large-scale world assets take to create.

I agree that doing procedural generation properly takes a lot of work, but there's a reason that AAA open-world games like GTAV cost hundreds of millions of dollars to develop.


There's a reason that AAA open-world games like GTAV are using rather limited procedural generation, too.

The problem is that to create N pieces of satisfyingly unique procedurally generated content, you need M pieces of content that were carefully designed within additional restrictions to make them combinable in various fashions - making them harder to create. Then you add code to glue that all together. If you think M=0, that just means you've hardcoded your content in code.

When N >> M - e.g. you're generating thousands of similar-yet-distinct roguelike levels, trees, etc. - procedural generation can pay off.

But even for large-scale world assets, it's frequently the case that N < M. Making the content non-procedurally is going to be cheaper - fewer limitations to worry about, and fewer pieces of content to actually generate. A few clever tricks - rotating, retexturing, tweaking a base model, etc. - let you make a lot of non-procedural content without creating each piece from scratch, and let you stretch that content so it appears you have more variations than you actually do.


I don't think No Man's Sky will show a significantly better RoI, in the long run. Star Citizen is certainly racking up the bills.

We could even argue that AAA game studios using hand-modeled art assets instead of procedurally generated content is proof that hand-modelling is the "cheaper" route. Though most AAA game studios are using a hybrid approach and have some form of procedurally generated content in their pipeline, whether it's to create quests or to create the baseline environment on top of which bespoke content is added.

Projects are hard. Full stop.


But it does scale better, doesn't it? Seems that would be the case if more tools and libraries like this one were open-sourced.


Not really. The problem with procedurally generated content is making interesting iterations. Sure, you can have a million spaceships just as easily as one, but will anyone care about them? It is a different sort of work, but of equal effort, compared to "just" modeling the content.


Even most landscapes on Earth (reality is "procedurally generated" by the laws of physics) are pretty boring. Spectacular documentaries must travel far to reach the interesting ones.

Interesting procedural generation may require strong AI, because it must model how it is interpreted by us.


> Even most landscapes on Earth (reality is "procedurally generated" by the laws of physics) are pretty boring.

Strongly disagree! I truly feel sorry for you if you can't see beauty and spectacle everywhere you look in nature.


Well, for now, yes - but things like this tend to get easier as new techniques are perfected and more powerful tools are developed, whereas building a thousand art assets by hand will remain time-consuming indefinitely.


Not only for indie games, mainstream games like Stellaris could be vastly improved by this. I already posted a link to their forums. :)

https://forum.paradoxplaza.com/forum/index.php?threads/space...


Procedural generation of meshes is difficult, but not costly (in terms of CPU, draw calls, memory, textures, or total polygons/tris in the combined mesh).

Many, many other games use modular assets, especially when poly and tri counts need to be kept down for mobile and low-power consoles/handhelds.

With a few component models that can be separated and added to a base model, procedural mesh generation can work. But it is extremely messy when starting out, just due to the complexity and the need to significantly redesign assets to be modular rather than singular.

Once designers and artists are on board - and that's half of the battle initially, getting artists to make modular designs for assets instead of single designs for flagship or lead assets - you then need to create code to make the game actually work with Frankenstein ships, or Frankenstein characters, buildings, etc.

It leads to intriguing art and possibilities.

And then there's working out all of the intricacies: positions, storage, blending, replacement/deformation of modular assets, how to diagnose, test and display problems, modifying or identifying pre-generated or fixed module assets in the GUI, storing the atomic position and/or physics settings, looking for deformed meshes, hollow meshes/gaps, raytrace problems, UV wrapping and deformation, problems with textures, etc. Batch calls, pooling, spawn locations, collision meshes, and so on.

And those are just the things I could think of from regular assets. Modular meshes, as well as modular textures, are usually a no-go area, because of the problems with the engine and the performance guesswork when problems come up at the last minute. Or in the first minutes. Etc.

Most indie games don't have the time to waste on procedural generation unless it's been troubleshot and tested by designers and coders, or sold as an asset by others.

Buildings are often generated like this in games.

No Man's Sky is perhaps the most popular ur-example of how to make procedurally generated assets using math functions and geometric texture generation (it's been a long time since I looked at the tech they were supposed to be using under the hood, but if I remember correctly it was something like fractal/geometric generation of textures to minimize the workload),

but it is still a tech demo of a well-understood but infrequently used tool.

That is, until they release the game.


That would be No Man's Sky®.


I don't really like most procedural generation. It has no meaning and the results are not intellectually stimulating. At best you can't spot the pattern and the parameters, but you usually can after a few examples.

An idea I'm more interested in is generating requirements and using an optimiser to solve for the actual design. That way there is a hidden "why". With some study, a human might be able to discern why x is so thick or why A is attached to B. When a design has a use in mind, it has meaning.


Yeah, as I was watching the animation for the algo, I was thinking how nice it'd be if you added purpose to all these segments and features, then produced ships for various applications through fitness algorithms and evolution.

Simulate a small space economy, create a DAG of how components and materials are produced, simulate situations this ship will be placed in and calculate net present value, see the outcome in the difference between a freighter and a fighter.


Would be pretty funny to see a genetic algorithm set loose on that. I imagine it would end up something like most people screwing around with a spaceship builder: "and what if we add forty engines?"


Do you have any practical examples of what you mean? It seems obvious, but getting a computer to understand requirements, and also solve for them and build a realistic 3d model? Seems like an impossible task for current technology.


> I don't really like most procedural generation.

The demo scene has some mind-blowing stuff.


The demoscene is awesome, but I see one-off generation as a type of compression. What I have in mind is games where procedurally generated families of things are supposed to be interesting and worth exploring and discovering. I find those underwhelming.


I agree that most procedural generation is underwhelming. Merely remixed/varied things aren't exciting. However, things that have some sort of function -- things that have evolved or have been curated and tweaked -- are often very interesting. Procedural generation needs to be attached to such mechanisms and provide a tool towards a goal to be interesting.


Check out No Man's Sky: https://www.youtube.com/watch?v=D-uMFHoF8VA&feature=youtu.be...

An entire galaxy of planets, fauna and creatures (including shape, coloring and even voices) is procedurally generated!


The point of procedural generation is to have designs that are not designed by a designer so they can surprise us in unusual ways.


The point is to create lots of useful content; it doesn't have to be unintended or undesigned.

Procedural generation can be entirely predictable. Also, if it doesn't fit the eyes of the designer (who could also be the programmer), then the algorithm or parameters will be changed until it does. In that way, most procedurally generated content is designed.
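
A minimal sketch of that predictability (the parameter names here are invented for illustration, not the actual generator's): the same seed string always yields the same ship.

    import random

    def ship_params(seed):
        # seeding a private Random instance makes the output deterministic:
        # the same seed string always produces the same parameter set
        rng = random.Random(seed)
        return {
            'hull_segments': rng.randint(3, 8),
            'wing_count': rng.randint(0, 4),
            'symmetry': rng.choice(['bilateral', 'radial']),
        }

    assert ship_params('tweer') == ship_params('tweer')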


I was thinking something like this could be implemented with a "race" parameter, or something that would generate random ships but with identifiable traits across the fleet. You get a sort of intellectual stimulation there from a cultural aspect (Klingon design is decidedly different from Romulan), but without the minutiae of actually designing ships.


Yes, this is the eternal 'intelligent design' vs. 'natural process' question, in a new light. I agree that artifacts look more convincing if they are designed (by a human), but there is still no reason to object to using a computer to help generate a multitude of variants of something based on the original design (constraints), from which one can choose what they like. At the same time, there is nothing wrong with (much) less involvement on the part of a human in generating natural objects - landscapes, trees, planets, or even animals, as nobody has designed (originally) these things in the real world.


I think (maybe) you're arguing for using genetic algorithms for generating these things, which would be very cool. Unfortunately, modeling realistic requirements is generally harder than most programmers would think.


Your idea is interesting, so I tried to think of things that have employed it in the past.

Basically, you're arguing for stronger creation tools for a narrow ___domain.

The Spore creature creator seems to be an example. It was awesome for the variety, and I wish that portion of the game could have been taken up more by the industry as a content creation tool for devs.

At the same time, models in Spore look pretty same-y - as is the problem with all procedural content.

Still, same-iness is not always a bad thing.


I find the arguments of procedural vs. artistic content somewhat humorous, given that artistic content is just an attempt to create a facsimile of actual natural and human-generated artifacts. The natural world is by definition procedural (unless you believe in a Creator). Human artifacts have some artistry - but only in the broad strokes. Most buildings, and most details of architected buildings, follow a procedural semantic. The natural decay of human artifacts follows a procedural semantic.

I assume that in the near future of VR few will play "AAA games" (and hence few will exist) because they won't be able to compete with free procedurally generated environments.


I don't know why you're getting downvoted, because you're absolutely right. Most human architecture is highly procedural, with many tradesmen specializing in highly repetitive, focused tasks, that have predictable, reliable outcomes, in order to produce safe structures based on well-understood principles.

As for the VR thing, I think that will gain less traction because the idea of sleeping one's life away, strapped into goggles and headphones, is only so attractive.


Most human architecture is highly procedural, with many tradesmen specializing in highly repetitive, focused tasks, that have predictable, reliable outcomes

Yes, but human architecture is shaped by forces following from function (as in Christopher Alexander's "A Pattern Language"). That's why it becomes interesting: we can relate to how the architecture relates to human needs and perceptions.


I find Blender quite handy when it comes to (randomized/parameterized) generation of 3D objects for the purpose of rendering, although I've only done really simple things. However, the API, while in Python (which is good), feels very unpythonic and clumsy, all while being severely under-documented. It would be really nice if the API could be cleaned up in a future major release...


Cool, but the ships are all rather samey. They all spring from the root hull, with a main body being longer than it is wide. Perhaps that is needed to conform with our terrestrial concept of "ship". But I think it could be improved by randomizing the number or shape of starting hulls. It would also be interesting to see what could be done by extruding spherical shapes rather than boxes.


Very cute, although the ships look same-ish (it would probably be interesting to start with different base templates, etc.).

Still, it's only a few steps away from "random British 1970s SF book cover".


Do you have a link to the book cover generator? Or is that a metaphor?


I think he might have meant Chris Foss:

http://www.chrisfossart.com

I grew up imagining these spaceships becoming reality, and now .. they sort of are .. at least, I can bend and play with spaceships just like them now, albeit .. in Blender .. ;)


It's a joke — most British SF novels (and UK editions of US SF novels) seemed to get published with covers featuring some kind of giant spaceship, whether there was one in the story or not. Chris Foss is, per another reply, the most well-known artist of the genre.



Also reminds me of Peter Elson, though his designs are more curved. A few smoothing or relaxation passes might achieve something similar.

http://www.peterelson.co.uk/gallery/category.php?pageNum_pro...


The selected examples got me thinking: Those eight are presumably examples the author felt turned out particularly well. Could one build a neural network trained on generated ships that "turned out well" to automatically generate better-looking random ships?

Could this sort of process be used in games where procedural generation might otherwise be rejected because it looks "too random"?


You can think of each random result as exploring one path through a huge tree of possibilities. You can then pick 9 paths for the next step, have the user choose the one they prefer, and keep repeating until the user feels they're done.

We used to use this sort of approach to produce random shapes for animation and display on large screens in clubs in the mid-90s. You can also animate by tweening one or more parameters at particular branching points or between branches. If you pick the right places on these ships, that could produce some interesting movement in the craft as well.
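
A bare-bones sketch of that select-and-branch loop (the seed-derivation scheme and the render step are hypothetical placeholders):

    import random

    def child_seeds(parent_seed, n=9):
        # derive n candidate seeds deterministically from the chosen parent
        rng = random.Random(parent_seed)
        return [rng.random() for _ in range(n)]

    seed = random.random()
    for generation in range(5):
        candidates = child_seeds(seed)
        # render a ship for each candidate here and show all nine to the user
        chosen = 0  # index of the ship the user preferred
        seed = candidates[chosen]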


> Could one build a neural network...

Short answer: yes.

Long answer: yes, but you'd have to retrain it for each style/form/concept that "well" was defined for.

But yes, it's possible. Would be a lot of work though.


The extreme examples remind me so much of the Shivan ships in the Freespace series.

I don't think the script replaces a professional designer, but this is awesome for brainstorming ship ideas.


My physics knowledge is pretty weak. Wouldn't you want a spaceship to be closer to something like an oblate spheroid? A lower surface-area-to-volume ratio makes for cheaper, lighter spaceships, presumably? The primary thing I'm unsure of is steering, but how much of a problem can that be?


Pure speculation, but I think heat dissipation is probably the more important bottleneck of spaceship design -- no convective cooling, so you potentially want more surface area to increase radiation.

I think aerodynamics (both lift to weight and streamlining) are the reason for the compact design of aircraft and submarines (also perhaps reducing the profile for military vehicles).

Weight may be an issue for maneuvering in combat, but there's tons of structural material literally floating around in space, so I'd wager that base material cost would not be a limiting constraint.


Steering is no problem; in space everything rotates about the center of gravity, regardless of shape.

You probably would have boxy starships for the same reason you have boxy houses: more usable space. The same arguments apply for spherish houses, but those never caught on. Given the distance that the starship is going to travel, the main cost is probably not the materials so much as the time, so you'd want to maximize the amount of stuff you could ship back and forth.


Something to do this in Lego Digital Designer would get considerable interest.


I did pretty much that for a previous challenge in /r/proceduralgeneration - procedural castles, where I used LDraw as the backend for rendering.


Great work! I love a lot of the ships you show. I'll have to look at the code next weekend. I liked the city-building scripts from years ago in Blender too. Fun stuff to create a ton of assets automatically.

I was working on a procedural art generator in Blender in 2006, and I tried to use genetic programming written in Lisp to feed random parameters into a fixed generator I had written in Python, copied from a parametric formula renderer (I can't remember the original author). I couldn't get it to work well, and got sucked into Processing shortly thereafter, and other Blender things.

You could meld your script with a genetic program to present quick renders it evolves, and use neural nets to drive it towards what you like and away from what you don't, to evolve a design. This cuts the search space, and thus the time, down compared to a purely random form generator, in producing images you may want.

You've killed my next weekend!

Again, great work!


Form follows function! No wait... I think I have it backwards


Awesome stuff. Would love to see concept artists incorporating procedurally generated assets into their workflow. Produce 100 samples like this, teach the computer which ones they prefer, produce 100 more, take a few and refine them by hand.


A lot of artists in various industries use Alchemy to generate ideas like this.

I like to use the ms reading on my stopwatch when I don't have anything else nearby. I'll draw up a few scales like inorganic <---> organic, long <---> short, heavily armed <---> unarmed, and then get a 0-99 ms reading for each by pausing my stopwatch.

Edit: Example. https://www.instagram.com/p/BG0Gr5BRBs7/

The same can be done for character gen, scenario gen, life choices :), etc.
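
The same trick works without a stopwatch; a quick Python sketch (the scale names are just the examples above):

    import random

    scales = ['inorganic <---> organic',
              'long <---> short',
              'heavily armed <---> unarmed']

    # roll a 0-99 value for each scale, like pausing the stopwatch
    brief = {scale: random.randint(0, 99) for scale in scales}
    for scale, value in brief.items():
        print('{}: {}'.format(scale, value))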


When I tried to run it, I got:

    Traceback (most recent call last):
      File "/spaceship_generator.py", line 737, in <module>
      File "/spaceship_generator.py", line 711, in generate_spaceship
    AttributeError: 'BevelModifier' object has no attribute 'offset_type'

I ran it with Blender 2.69


Works with 2.77a though.
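
If you do need it on 2.69, guarding the attribute assignment should sidestep the error; an untested sketch (the assigned value is a guess, use whatever the script's line 711 actually sets):

    # inside generate_spaceship(), where the Bevel modifier is configured
    bevel = obj.modifiers.new('Bevel', 'BEVEL')
    if hasattr(bevel, 'offset_type'):
        # offset_type doesn't exist in Blender releases as old as 2.69
        bevel.offset_type = 'OFFSET'  # guessed value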


Very Macross of you.


Dawkins wrote the biomorph software to demonstrate evolution; you got to select squiggles and breed them.

What would be cool is to evolve spaceships by selective breeding.

No doubt, this being The Internet, this was done by a Russian seven years ago, and I failed to GTFA.


Has anyone 3D printed one of these models?

http://www.shapeways.com/tutorials/prepping_blender_files_fo...


Thing is, /r/proceduralgeneration is running a monthly challenge for exactly this right now, but I haven't seen it listed as an entry yet. Check out the previous entries; some of them are neat (others less so).


It's the top link on the challenge thread, isn't it?


I'm delighted by how little code this is!


Someone should be working on spaceship guidance software with UI interfaces for humans. We're going to need it.


This … this is awesome. Thanks for sharing!


If this works as "advertised" then this is amazing. I'd love to see a game implement this.


For a second I had to check whether this was from the "Limit Theory" guy, but no; RIP LT.


Awesome work! Now I need a website where I can just generate them and view them in WebGL.


Oh wow - I can procedurally generate 3D models?!!! Blender tutorial needed :-)


Coincidentally, Blender's UI was procedurally generated as well.



Blender has Python scripting support; it's pretty good, actually.
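
A tiny taste of it from the built-in Python console (a toy example, nothing to do with the generator itself):

    import bpy

    # add a cube and stretch it into a crude hull shape
    bpy.ops.mesh.primitive_cube_add()
    hull = bpy.context.active_object
    hull.name = 'Hull'
    hull.scale = (1.0, 1.0, 3.0)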


I'm guessing not the algorithm used to create the Destiny.


Why are there windows on a spaceship?


Because windows aren't unheard of even now on spaceships?

http://atlcoin.com/atlcoinblog/wp-content/uploads/2011/09/UF...


https://youtu.be/C-qEmmpGYvA?t=36s

Seems to be human nature.


Was that a historically accurate representation of the spacecraft designers' German accents, or was the movie gratuitously racist?


I am more impressed by how compact the propellant tanks must be... ;-)


This is very impressive. Nice work.


Could you do procedural cars?



There is an error in the script:

    seed = 'tweer'
    obj = generate_spaceship(seed)

Python can't redefine functions as variables. Seed is a function.


I'm not the author, but I frequently code in Python and am unable to understand what you're saying from that snippet of code.

1. You can overwrite everything in Python.

2. seed has been set to a string.

3. obj uses the value of the previously set 'seed' variable, which should be a string at that point.

It's not a good idea, like overwriting the id builtin, but it should work...

And just in case somebody wonders: it's still possible to use the "true" seed() by calling it via __builtins__.seed().

I did similar stuff previously without realizing it, and only found out about this after I switched to a real IDE that warned me about overwriting built-in functions.


'seed' is explicitly imported on line 18, then overwritten on line 736, and yet it's used as a function on line 526. As far as I can tell, that is an error.

    >>> from random import seed
    >>> def f(): return seed(10)
    >>> seed = 50
    >>> f()
    Traceback (most recent call last):
      File "<pyshell#4>", line 1, in <module>
        f()
      File "<pyshell#2>", line 1, in f
        def f(): return seed(10)
    TypeError: 'int' object is not callable
    >>> 
I haven't run the script, so take this with a grain of salt.

EDIT: Since generate_spaceship is always called with an empty random_seed parameter, this error won't happen when the script is run.


Yes, that's correct.

    >>> from random import seed
    >>> seed(10)
    >>> seed
    <bound method Random.seed of <random.Random object at 0x24721c0>>
    >>> seed = 50
    >>> seed
    50
    >>> from random import seed
    >>> seed
    <bound method Random.seed of <random.Random object at 0x24721c0>>
(You can't get it as __builtins__.seed because it's not a builtin, but you can re-import it from random to get it back.)


> Start with a box.

And end with some boxes stuck together.


This is a pretty standard 3D modelling strategy, unsurprisingly called "box modelling". You can create an incredible number of things with box modelling, including, obviously, a lot of spaceships. I built several crude spaceship designs with box modelling back in school. The key advantage is that all of your 'parts' are built off a single object in the editor and aligned perfectly.

The other thing you can do to a box-modelled project is round all the edges to varying degrees and end up with something looking much more fluid and amorphous. However, I suspect the results would be far more unpredictable if he threw something like that into this script, so it'd be difficult to go there procedurally.
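
In Blender terms that rounding is basically a Bevel or Subdivision Surface modifier dropped onto the finished mesh; a sketch (not part of the script, and the numbers are arbitrary):

    import bpy

    obj = bpy.context.active_object

    # soften the hard box-modelled edges
    bevel = obj.modifiers.new('RoundEdges', 'BEVEL')
    bevel.width = 0.05
    bevel.segments = 3

    # or go fully fluid/amorphous with a subdivision surface
    subsurf = obj.modifiers.new('Smooth', 'SUBSURF')
    subsurf.levels = 2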


Yeah, back when I made 3D models of Star Trek spaceships (is the trekkie in your name significant?), that was one of the go-to methods for making the shuttles, smaller craft, and Borg ships (obviously), but for stuff like Voyager and the Enterprise-E you would often resort to more advanced lofting/spline-patch-based techniques and work off of a sketch.


Pretty much.

I used to enjoy making 3D spaceships in Blender and such. Simple extrusion and resizing of faces is the path of least resistance, but it doesn't lead to very nice results.

The mesh should follow the shape you wish to create, not have the shape be dictated by the geometry of the mesh.


Seems to me to be an excellent reduction of the actual work required to build a decent spaceship. :)



