[NEW] Blender-DragonAge Model converter


4 replies to this topic

#1
BioSpirit
  • Members
  • 261 messages
I have released a new model exporter/importer for Blender. The project page is here: social.bioware.com/project/2312/

This tool is designed for level models and static props such as barrels, statues, and other placeables. It does not support animation or moving parts, and I have no intention of adding support for them.

Here are some key features:

- Metadata generation for the lightmapper.
- User-defined walk mesh.
- User-defined collision mesh.
- A fairly well automated "Post to Local", though there is still room for improvement.

If you have comments or questions, post them here.

I created this application mostly because I couldn't get the existing tools available for Blender to work properly.

Edited by BioSpirit, 16 March 2010 - 09:39.


#2
BioSpirit
  • Members
  • 261 messages
Looks like I forgot to make the files public once again. The downloads are visible now.

EDIT: There is some documentation in the package, but I still need to write proper documentation in the wiki.

Edited by BioSpirit, 17 March 2010 - 01:11.


#3
Eshme
  • Members
  • 756 messages
Thanks. What's a modeller without tools? ;) --> He makes them himself.

#4
BioSpirit
  • Members
  • 261 messages
Creating a lightmap UVPack

I wrote in the README.txt that it's a good idea to use Blender's "Lightmap UVPack" feature to create the lightmap UVs, but I have to take that back. It seems that "Unwrap (Smart Projections)" works better: it doesn't use the UV space as efficiently as "Lightmap UVPack", but it produces better results with smaller lightmaps.

#5
Tarnui
  • Members
  • 1 message
You can use "Lightmap UVPack" with quality equal to "Smart Projections", but you need a higher margin value to prevent neighbouring UV islands from sharing border pixels. I use a margin value of 0.4 and have no problems with lightmap artefacts.
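For anyone scripting this instead of clicking through the UV menu, the two unwrap approaches discussed in the last two posts can be sketched with Blender's Python API (bpy). This is a minimal sketch written against a modern Blender; the operators and parameter names differed in the 2.4x releases this thread dates from, so treat it as illustrative rather than exact.

```python
# Sketch: unwrapping the selected mesh for a lightmap, either with
# Smart UV Project or with Lightmap Pack plus a generous margin.
# Assumes it runs inside Blender with a mesh object active.
import bpy

obj = bpy.context.active_object  # the level model or prop to unwrap
bpy.ops.object.mode_set(mode='EDIT')
bpy.ops.mesh.select_all(action='SELECT')

# Option 1: Smart UV Project -- packs UV space less efficiently, but
# often gives cleaner results at small lightmap resolutions.
bpy.ops.uv.smart_project(island_margin=0.02)

# Option 2: Lightmap Pack with a larger margin, per Tarnui's advice,
# so adjacent islands don't bleed into each other's border pixels.
# (The margin parameter's name and scale depend on the Blender
# version; the 0.4 below mirrors the value mentioned above.)
# bpy.ops.uv.lightmap_pack(PREF_MARGIN_DIV=0.4)

bpy.ops.object.mode_set(mode='OBJECT')
```

Since this depends on the Blender runtime, it can't be run as a standalone script; paste it into Blender's text editor or Python console with the target object selected.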