
Date: Sun, 23 Aug 2015 17:03:30 +0200 (CEST)
To: "Peter Stuge (peter AT stuge DOT se) [via geda-user AT delorie DOT com]" <geda-user AT delorie DOT com>
From: gedau AT igor2 DOT repo DOT hu
Subject: Re: [geda-user] Antifork
In-Reply-To: <20150823142225.5557.qmail@stuge.se>
Message-ID: <alpine.DEB.2.00.1508231634400.6924@igor2priv>
References: <55D8D8B8 DOT 7050907 AT jump-ing DOT de> <CAM2RGhSZ1vi_DFKqZdZYxhto4ZaXLLscBt5m5kk+PH2ZoYW_vw AT mail DOT gmail DOT com> <alpine DOT DEB DOT 2 DOT 00 DOT 1508230609370 DOT 6924 AT igor2priv> <20150823051355 DOT 30150 DOT qmail AT stuge DOT se> <alpine DOT DEB DOT 2 DOT 00 DOT 1508230728050 DOT 6924 AT igor2priv>
<20150823142225 DOT 5557 DOT qmail AT stuge DOT se>
User-Agent: Alpine 2.00 (DEB 1167 2008-08-23)
MIME-Version: 1.0
Reply-To: geda-user AT delorie DOT com


On Sun, 23 Aug 2015, Peter Stuge (peter AT stuge DOT se) [via geda-user AT delorie DOT com] wrote:

> gedau AT igor2 DOT repo DOT hu wrote:
>>> gedau AT igor2 DOT repo DOT hu wrote:
>>>> honestly, did you try to install a different version of glib in
>>>> your home using autotools and then compile pcb using that version?
>>>
>>> Yes. And not just with the native toolchain, but cross toolchains too.
>>
>> Out of curiosity, what exactly did you have to type for PCB's ./configure
>> to find the glib that was installed in your /home and not the glib
>> installed on your system?
>
> Set PKG_CONFIG_LIBDIR so that the /home glib .pc is found before the
> system one.
>
> If scconfig also uses pkg-config that helps compatibility a lot, but
> it is still only part of the story.

It tries pkg-config first, and falls back to other methods when that
fails. It's also possible to manually inject the info for some libs (the
plan is to support this for all detections).
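
To make it concrete, the pkg-config route looks roughly like this (a
minimal sketch; the ~/local and ~/pcb-test prefixes are just examples):

    # make the .pc files of the home-installed glib win over the system ones
    export PKG_CONFIG_PATH="$HOME/local/lib/pkgconfig"
    pkg-config --cflags --libs glib-2.0   # sanity check: should print ~/local paths
    ./configure --prefix="$HOME/pcb-test"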

>
>
>>> A large number of package management systems support autotools. It's
>>> a very round wheel.
>>
>> I have experience only with Debian's package management system. Since
>> scconfig offers a ./configure too, it's the same for packaging.
>> I think it's more about prejudice: if it's not autotools, it must suck.
>
> I'm more pragmatic. What is important is that the interface is
> compatible and functionality is comparable, but I haven't gotten the
> impression that you want scconfig to support most autoconf configure
> options and if you do then you have the problem of always having to
> chase the other implementation to keep up with their functionality -
> which is no fun at all. :\ I think it would be much better to spend
> the effort on fixing what problems remain in autotools instead of
> essentially rewriting them.

Yup, different projects have different goals and different features. That
often leads to different interfaces. scconfig does have --prefix, and I
generally plan to implement the few most-used options in a similar way.

Imo with autotools it's not about "fixing what problems remain". The
problems I faced in practice always came down to design issues. That's why
I decided to try an alternative design.

>
>> sometimes even without looking.
>
> I did look. Not now, but before. Did I misremember? In that case I
> would like to extend an apology.

I can't remember, and I didn't mean specifically you. I meant users in 
general.

Totally off-topic: I once wrote an article in a local newspaper about how
people could consider switching to open source software. IIRC it was an
answer to another article about the license change of some proprietary
software and how to work around it. As an obvious result, I got into an
endless flamewar with some guy who happened to be in love with one of the
proprietary packages. Mail after mail he came up with a feature he thought
the open source implementation didn't support, but it almost always turned
out to be supported after all. What I figured out after some mails is that
the main "problem" was that the open source software did not have the same
menu system. So the actual showstopper was not the lack of features but
the different menu layout (or UI design in general)!

Autotools is very popular. I do realize that many developers and packagers
are familiar with the interface it offers. I also realize the merits of a
well-known interface. However, I do not think that autotools is a good
design, or that its interface is particularly convenient, or that
autotools should have no alternatives.

>
> Even then, without a common interface anything else *is* worse, when
> a lot of infrastructure is built around the autotools interface.

I strongly disagree on this.

>
>
>>>> And did it work for the first attempt, without questions raised?
>>>
>>> Yes, every time, except for Windows, because of the guile problem.
>>
>> My bad experience with autotools comes from 6+ different UNIX systems.
> ..
>> Like once I spent days getting bash, screen, subversion and all their
>> dependencies compiled on a proprietary SysV-like UNIX of the late 90s.
>
> Here we are, 15 years later. I also had fun with proprietary systems,
> but I decided that they are too limited for me and that they waste my
> time. I don't blame autotools, I blame the proprietary system, so to say.

Partially right.

However, look at the actual bloat autotools introduces. The best way to
see it is to compare a smallish project right before the autotools
transition and right after it.
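
A quick, rough way to look (just a sketch; which of these files exist
depends on the project):

    # hand-written input vs. generated machinery, in lines
    wc -l configure.ac Makefile.am
    wc -l configure Makefile.in aclocal.m4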

If we admit that autotools does not really solve most of the portability
issues, and only makes the project work on Linux, some BSDs and maybe,
maybe, maybe on Windows... well then, I'd rather have a Makefile.Linux, a
Makefile.BSD and a Makefile.win32.
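
With per-OS makefiles the only "configure-ish" step left is picking the
right file, e.g. (a sketch, assuming the suffixes follow the output of
uname -s; the win32 one would stay a manual choice):

    # on the Unix-likes; on Windows one would pass Makefile.win32 by hand
    make -f "Makefile.$(uname -s)"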


It's about finding a local optimum. Optimum 1: let it be large,
complicated and bloated, but then it absolutely must work out of the box
on whatever system I have at hand. Optimum 2: let it be manually tailored
to a given system and support only the 2 or 3 most popular systems, but at
least keep it small and easy to hack.

Of course this is my personal opinion, and there are as many tastes as
there are users.

>
> You're probably one of a few who have access to that system so you are
> in a fairly unique position to improve autotools for it. But I do
> understand that you don't; the cost/benefit ratio for corner cases
> isn't great.

Funnily enough, that's more like the normal case for me. I install at
least 95% of my software from Debian using apt, so I never even meet the
build systems. For the remaining 5%, I either want to do something tricky
or it is on a non-Debian system (which is most often non-Linux as well).
Hacking the build system of autotools-based projects is far from easy; so
is getting an autotools-based project to compile on "exotic" systems. So
the majority of cases where I actually have to do something with autotools
are also the cases where it breaks.

I'd happily fix it if it were about minor implementation problems. But I
really believe it's about a few major design issues. I sort of fixed it by
implementing a different design.

>> sometimes megabyte long generated shell scripts don't help at all.
>
> Yes, if the generated code is broken that's no fun. But why would it be?

Yeah, I'd ask "why would it break" too, scratching my head over the vt420 
or the telnet session.

>> The problem is not of technical nature. The problem is how much risk and
>> effort my potential user wants to take/invest.
>
> I think that's accurate, and it goes hand-in-hand with knowledge
> about using autotools or another configuration system to build
> software the desired way.

I think it goes far beyond the build system. My recent impression is that
just checking out code from a VCS, or downloading a tarball and compiling
and installing from source, is already a showstopper for many users.

I didn't feel this, uhm, 5 years ago. I don't know what's changed since;
maybe the app-store thing?

<snip>

>>  - chroot is virtually unknown these days
>
> And not so neccessary.

I was referring to the case when people don't want to pollute their system 
with untrusted random software.


>> This made me think to try the "unpack this tarball as an user and run
>> ./start".
>
> Yes. See Ubuntu snappy apps AKA snaps.
>
> https://developer.ubuntu.com/en/snappy/tutorials/build-snaps/

Nice. Some ideas are similar to the simplicity of some BSD systems'
packaging (where a package is just a tarball you unpack to /); some ideas
are like /opt in the FHS.

I go for something similar, but a bit more flexible.

First I wanted to build one big static executable. That turned out to be
too much of a hassle, especially on the GTK side. So I figured I'd ship a
dynamic executable and all the .so files it links to on my system (yes,
yes, it's as ugly as it can get at this point!). I now have a script that
uses ldd to collect all the libs into a directory.
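
Roughly, that collecting script boils down to something like this (a
sketch; the binary name and the bundle/ directory are just placeholders):

    # copy every shared lib that ldd resolves for the binary into ./bundle/lib
    mkdir -p bundle/lib
    ldd ./pcb | awk '/=> \//{print $3}' | sort -u | while read -r so; do
        cp -v "$so" bundle/lib/
    done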

However, the user may already have some of the libs installed, in which
case it may be beneficial (or even critical) to use those. So I'll also
ship a script that does the same ldd magic on the target machine, on all
the shipped executables, to figure out which libs are not found there and
keep only those from my distribution.
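
On the target side the same trick works in reverse (again just a sketch;
"shipped-lib/" stands for wherever the tarball's libs were unpacked):

    # keep only the shipped copies of libs the target system cannot resolve
    mkdir -p lib
    ldd ./pcb | awk '/not found/{print $1}' | while read -r so; do
        cp -v shipped-lib/"$so" lib/
    done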

I am not yet sure this won't introduce FUBAR situations with mixed
versions, but at least the user will have the option to fall back to using
all the libs from the shipped tarball, and then only the libc version
should matter (or... who knows).
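
That fallback can then be just a small wrapper script, something like this
(a sketch; the bin/ and lib/ layout inside the unpacked tarball is an
assumption):

    #!/bin/sh
    # ./start: run the shipped binary against the shipped libs only
    here=$(cd "$(dirname "$0")" && pwd)
    LD_LIBRARY_PATH="$here/lib" exec "$here/bin/pcb" "$@"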

Ohh, and I aim to support doing all this in a simple home directory.

Regards,

Igor2
